
How far is Sanming from Tampa, FL?

The distance between Tampa (Tampa International Airport) and Sanming (Shaxian Airport) is 8454 miles / 13605 kilometers / 7346 nautical miles.

Tampa International Airport – Shaxian Airport

Distance: 8454 miles / 13605 kilometers / 7346 nautical miles
Flight time: 16 h 30 min
CO2 emission: 1 065 kg


Distance from Tampa to Sanming

There are several ways to calculate the distance from Tampa to Sanming. Here are two standard methods:

Vincenty's formula (applied above)
  • 8454.022 miles
  • 13605.429 kilometers
  • 7346.344 nautical miles

Vincenty's formula iteratively computes the geodesic distance between latitude/longitude points using an ellipsoidal model of the earth, which accounts for the planet's polar flattening and is therefore more accurate than a spherical model.
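The iteration can be sketched in a few dozen lines of Python. This is a standard Vincenty inverse implementation; the WGS-84 ellipsoid parameters are an assumption (the page does not state which ellipsoid it uses), and the coordinates are taken from the airport table below.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2,
                     a=6378137.0, f=1 / 298.257223563,
                     tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid (assumed).
    Returns the geodesic distance in meters."""
    b = (1 - f) * a
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # first approximation of the longitude difference on the auxiliary sphere
    for _ in range(max_iter):
        sinlam, coslam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinlam,
                               cosU1 * sinU2 - sinU1 * cosU2 * coslam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        if cos2_alpha == 0:
            cos_2sigma_m = 0.0  # both points on the equator
        else:
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# TPA (27°58′31″N, 82°31′59″W) → SQJ (26°25′34″N, 117°50′0″E)
km = vincenty_inverse(27.975278, -82.533056, 26.426111, 117.833333) / 1000
```

With these inputs the result lands within meters of the 13605.429 km quoted above.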

Haversine formula
  • 8442.183 miles
  • 13586.377 kilometers
  • 7336.056 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
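By contrast, the haversine formula fits in a few lines. A minimal sketch, assuming the conventional 6371 km mean earth radius (the page does not state which radius it uses) and the coordinates from the airport table below:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# TPA (27°58′31″N, 82°31′59″W) → SQJ (26°25′34″N, 117°50′0″E)
d = haversine_km(27.975278, -82.533056, 26.426111, 117.833333)
```

This reproduces the roughly 13586 km figure above; the ~19 km gap to the Vincenty result is the cost of treating the earth as a perfect sphere.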

How long does it take to fly from Tampa to Sanming?

The estimated flight time from Tampa International Airport to Shaxian Airport is 16 hours and 30 minutes.
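The page does not publish its estimation method. A common rule of thumb divides the distance by an average block speed; the sketch below assumes 825 km/h, an assumed figure typical of long-haul jets, and lands close to the 16 h 30 min quoted above.

```python
AVG_SPEED_KMH = 825  # assumption; the page does not state its parameters
distance_km = 13605

hours = distance_km / AVG_SPEED_KMH
h, m = int(hours), round((hours % 1) * 60)
print(f"{h} h {m} min")  # roughly 16 and a half hours with this assumption
```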

Flight carbon footprint between Tampa International Airport (TPA) and Shaxian Airport (SQJ)

On average, flying from Tampa to Sanming generates about 1 065 kg of CO2 per passenger, which is roughly 2 348 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
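The unit conversion and the implied per-kilometer emission rate are easy to check (0.45359237 kg per pound is the exact definition of the pound; the per-km rate is derived here, not stated on the page):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 1065
distance_km = 13605

co2_lb = co2_kg / KG_PER_LB           # pounds of CO2 per passenger
per_km_g = co2_kg / distance_km * 1000  # grams of CO2 per passenger-kilometer
```

This works out to roughly 78 g of CO2 per passenger-kilometer, in line with typical long-haul economy estimates.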

Map of flight path from Tampa to Sanming

See the map of the shortest flight path between Tampa International Airport (TPA) and Shaxian Airport (SQJ).

Airport information

Origin Tampa International Airport
City: Tampa, FL
Country: United States
IATA Code: TPA
ICAO Code: KTPA
Coordinates: 27°58′31″N, 82°31′59″W
Destination Shaxian Airport
City: Sanming
Country: China
IATA Code: SQJ
ICAO Code: ZSSM
Coordinates: 26°25′34″N, 117°50′0″E
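The coordinates above are given in degrees, minutes, and seconds; the distance formulas need signed decimal degrees. A small helper (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees; S and W are negative."""
    sign = -1 if hemisphere in "SW" else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# TPA: 27°58′31″N, 82°31′59″W
lat = dms_to_decimal(27, 58, 31, "N")   # ≈ 27.9753
lon = dms_to_decimal(82, 31, 59, "W")   # ≈ -82.5331
```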