
How far is Nanjing from Tampa, FL?

The distance between Tampa (Tampa International Airport) and Nanjing (Nanjing Lukou International Airport) is 8090 miles / 13020 kilometers / 7030 nautical miles.

Tampa International Airport – Nanjing Lukou International Airport

Distance: 8090 miles / 13020 kilometers / 7030 nautical miles
Flight time: 15 h 49 min
CO2 emission: 1 012 kg


Distance from Tampa to Nanjing

There are several ways to calculate the distance from Tampa to Nanjing. Here are two standard methods:

Vincenty's formula (applied above)
  • 8090.143 miles
  • 13019.823 kilometers
  • 7030.142 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
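For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates are converted to decimal degrees from the DMS values listed under "Airport information"; the constants and convergence tolerance are assumptions, so the result may differ from the figure above by a small rounding margin.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2):
        # WGS-84 ellipsoid parameters (assumed; Vincenty works on any ellipsoid)
        a = 6378137.0          # semi-major axis in metres
        f = 1 / 298.257223563  # flattening
        b = (1 - f) * a        # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):  # iterate until the longitude difference converges
            sinLam, cosLam = math.sin(lam), math.cos(lam)
            sinSigma = math.hypot(cosU2 * sinLam,
                                  cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
            if sinSigma == 0:
                return 0.0  # coincident points
            cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
            sigma = math.atan2(sinSigma, cosSigma)
            sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
            cos2Alpha = 1 - sinAlpha ** 2  # nonzero for this route (non-equatorial)
            cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
            C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
            lamPrev = lam
            lam = L + (1 - C) * f * sinAlpha * (
                sigma + C * sinSigma * (
                    cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
            if abs(lam - lamPrev) < 1e-12:
                break

        u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                                 * (-3 + 4 * cos2SigmaM ** 2)))
        return b * A * (sigma - deltaSigma)  # metres along the ellipsoid

    # TPA and NKG in decimal degrees (from the DMS coordinates below)
    metres = vincenty_distance(27.975278, -82.533056, 31.741944, 118.861944)
    print(metres / 1609.344)  # should land near 8090 miles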

Haversine formula
  • 8077.338 miles
  • 12999.215 kilometers
  • 7019.015 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
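As a sketch, the haversine figure above can be reproduced in a few lines of Python, assuming a mean Earth radius of 6371 km (inferred from the numbers above rather than documented by the site):

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # Great-circle distance on a sphere of the given mean radius.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    print(haversine_km(27.975278, -82.533056, 31.741944, 118.861944))
    # ≈ 12999 km, about 21 km shorter than the ellipsoidal result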

How long does it take to fly from Tampa to Nanjing?

The estimated flight time from Tampa International Airport to Nanjing Lukou International Airport is 15 hours and 49 minutes.
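The site does not publish its flight-time formula, but a duration like this is typically derived from the distance and an assumed average speed. The sketch below back-solves that assumption: an effective block speed of roughly 511 mph reproduces the 15 h 49 min estimate. The speed is an illustration, not an official figure.

    distance_miles = 8090
    avg_speed_mph = 511.5  # assumed block speed, chosen to match the estimate

    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    print(f"{h} h {m} min")  # 15 h 49 min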

Flight carbon footprint between Tampa International Airport (TPA) and Nanjing Lukou International Airport (NKG)

On average, flying from Tampa to Nanjing generates about 1 012 kg of CO2 per passenger, which is about 2 230 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
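The kilograms-to-pounds conversion is straightforward; a one-liner using the standard factor of about 2.20462 lb per kg confirms the rounded figure:

    co2_kg = 1012
    co2_lb = co2_kg * 2.20462  # pounds per kilogram
    print(f"{co2_kg} kg ≈ {co2_lb:,.0f} lb")  # 1012 kg ≈ 2,231 lb (rounded to 2 230 above)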

Map of flight path from Tampa to Nanjing

See the map of the shortest flight path between Tampa International Airport (TPA) and Nanjing Lukou International Airport (NKG).

Airport information

Origin: Tampa International Airport
City: Tampa, FL
Country: United States
IATA Code: TPA
ICAO Code: KTPA
Coordinates: 27°58′31″N, 82°31′59″W

Destination: Nanjing Lukou International Airport
City: Nanjing
Country: China
IATA Code: NKG
ICAO Code: ZSNJ
Coordinates: 31°44′31″N, 118°51′43″E
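The decimal-degree coordinates used in the distance sketches above come from these DMS values. A small helper (hypothetical, for illustration) shows the conversion, with south and west treated as negative:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        # Signed decimal degrees; "S" and "W" hemispheres are negative.
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    # Tampa International Airport: 27°58′31″N, 82°31′59″W
    print(dms_to_decimal(27, 58, 31, "N"))    # ≈ 27.975278
    print(dms_to_decimal(82, 31, 59, "W"))    # ≈ -82.533056

    # Nanjing Lukou International Airport: 31°44′31″N, 118°51′43″E
    print(dms_to_decimal(31, 44, 31, "N"))    # ≈ 31.741944
    print(dms_to_decimal(118, 51, 43, "E"))   # ≈ 118.861944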