How far is Shijiazhuang from Tampa, FL?
The distance between Tampa (Tampa International Airport) and Shijiazhuang (Shijiazhuang Zhengding International Airport) is 7739 miles / 12455 kilometers / 6725 nautical miles.
Tampa International Airport – Shijiazhuang Zhengding International Airport
Distance from Tampa to Shijiazhuang
There are several ways to calculate the distance from Tampa to Shijiazhuang. Here are two standard methods:
Vincenty's formula (applied above)
- 7738.959 miles
- 12454.647 kilometers
- 6724.971 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
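As a quick cross-check, the same ellipsoidal (WGS-84) distance can be reproduced in a few lines of Python. This is a minimal sketch assuming the geopy package is installed; geopy's geodesic solver plays the same role as Vincenty's formula and should land within a few meters of the figure quoted above.

```python
# Ellipsoidal (WGS-84) distance check, comparable to Vincenty's formula.
# Assumes the geopy package is available (pip install geopy).
from geopy.distance import geodesic

tpa = (27.9753, -82.5331)    # Tampa International Airport (decimal degrees)
sjw = (38.2806, 114.6969)    # Shijiazhuang Zhengding International Airport

d = geodesic(tpa, sjw)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
# ≈ 7739 mi / 12455 km / 6725 NM, matching the Vincenty figures above
```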
Haversine formula
- 7725.361 miles
- 12432.763 kilometers
- 6713.155 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
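The great-circle figure is easy to reproduce directly. Below is a minimal sketch of the haversine calculation, assuming a mean Earth radius of 6,371 km (the exact radius constant used for the figures above is not stated, so the last decimal place may differ).

```python
# Great-circle (haversine) distance between two lat/lon points in decimal degrees.
# Assumes a spherical Earth with mean radius 6,371 km.
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344  # convert kilometers to statute miles

# TPA (27°58′31″N, 82°31′59″W) and SJW (38°16′50″N, 114°41′49″E) in decimal degrees
print(round(haversine_miles(27.9753, -82.5331, 38.2806, 114.6969), 1))  # ≈ 7725 miles
```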
How long does it take to fly from Tampa to Shijiazhuang?
The estimated flight time from Tampa International Airport to Shijiazhuang Zhengding International Airport is 15 hours and 9 minutes.
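The quoted time is an estimate derived from the distance and an average block speed; the exact speed the estimate uses is not stated. A minimal sketch, assuming an effective block speed of about 510 mph (an assumption chosen because it roughly reproduces 15 hours 9 minutes over 7,739 miles):

```python
# Rough flight-time estimate from distance and an assumed average block speed.
# The 510 mph figure is an assumption, not a parameter stated by the source.
def flight_time(distance_miles, block_speed_mph=510):
    hours = distance_miles / block_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m:02d} min"

print(flight_time(7739))  # ≈ 15 h 10 min, close to the quoted 15 h 09 min
```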
What is the time difference between Tampa and Shijiazhuang?
Shijiazhuang observes China Standard Time (UTC+8) and does not use daylight saving time, so it is 13 hours ahead of Tampa during Eastern Standard Time (UTC−5) and 12 hours ahead during Eastern Daylight Time (UTC−4).
Flight carbon footprint between Tampa International Airport (TPA) and Shijiazhuang Zhengding International Airport (SJW)
On average, flying from Tampa to Shijiazhuang generates about 960 kg of CO2 per passenger; 960 kilograms is equal to about 2,116 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
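The per-passenger CO2 figure comes from the estimate above; the pound value is just a unit conversion, shown here as a quick check:

```python
# Unit-conversion check for the CO2 figure above (1 kg ≈ 2.20462 lb).
KG_TO_LB = 2.20462
print(round(960 * KG_TO_LB))  # ≈ 2116 lbs per passenger
```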
Map of flight path from Tampa to Shijiazhuang
See the map of the shortest flight path between Tampa International Airport (TPA) and Shijiazhuang Zhengding International Airport (SJW).
Airport information
| Origin | Tampa International Airport |
| --- | --- |
| City: | Tampa, FL |
| Country: | United States |
| IATA Code: | TPA |
| ICAO Code: | KTPA |
| Coordinates: | 27°58′31″N, 82°31′59″W |
| Destination | Shijiazhuang Zhengding International Airport |
| --- | --- |
| City: | Shijiazhuang |
| Country: | China |
| IATA Code: | SJW |
| ICAO Code: | ZBSJ |
| Coordinates: | 38°16′50″N, 114°41′49″E |
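The coordinates in the tables above are given in degrees, minutes, and seconds; the distance sketches earlier use decimal degrees. A minimal conversion sketch (south and west hemispheres are negative):

```python
# Convert the DMS coordinates above to the decimal degrees used in the
# distance examples earlier on the page.
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

tpa = (dms_to_decimal(27, 58, 31, "N"), dms_to_decimal(82, 31, 59, "W"))
sjw = (dms_to_decimal(38, 16, 50, "N"), dms_to_decimal(114, 41, 49, "E"))
print(tpa, sjw)  # ≈ (27.9753, -82.5331), (38.2806, 114.6969)
```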