How far is Shihezi from Pago Pago?
The distance between Pago Pago (Pago Pago International Airport) and Shihezi (Shihezi Huayuan Airport) is 7564 miles / 12172 kilometers / 6573 nautical miles.
Pago Pago International Airport – Shihezi Huayuan Airport
Distance from Pago Pago to Shihezi
There are several ways to calculate the distance from Pago Pago to Shihezi. Here are two standard methods:
Vincenty's formula (applied above)
- 7563.510 miles
- 12172.289 kilometers
- 6572.510 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
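An ellipsoidal distance of this kind can be reproduced with the Python geopy library; its geodesic() function uses Karney's method on the WGS-84 ellipsoid, a modern refinement of the Vincenty approach, so expect agreement with the figure above to within a small fraction of a mile. A minimal sketch, with coordinates taken from the airport table below:

```python
# Sketch: ellipsoidal (WGS-84) distance between PPG and SHF using geopy.
# geopy's geodesic() implements Karney's method, a refinement of Vincenty.
from geopy.distance import geodesic

PPG = (-14.330833, -170.710000)  # 14°19′51″S, 170°42′36″W
SHF = (44.241944, 85.890278)     # 44°14′31″N, 85°53′25″E

d = geodesic(PPG, SHF)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} nmi")
```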
Haversine formula
- 7564.636 miles
- 12174.101 kilometers
- 6573.489 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
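For comparison, the haversine formula is simple enough to implement directly. A minimal Python sketch, assuming the commonly used mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere via the haversine formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# PPG -> SHF, coordinates from the airport table below
print(haversine_km(-14.330833, -170.710000, 44.241944, 85.890278))  # ~12174 km
```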
How long does it take to fly from Pago Pago to Shihezi?
The estimated flight time from Pago Pago International Airport to Shihezi Huayuan Airport is 14 hours and 49 minutes.
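The timing model behind this estimate isn't stated. As a sketch, the published figure corresponds to an average block speed of roughly 510 mph over the Vincenty distance; the speed below is an assumption, not a documented parameter:

```python
def block_time(distance_miles: float, avg_speed_mph: float = 510.0) -> str:
    # avg_speed_mph is an assumed average block speed, not a published parameter
    hours = distance_miles / avg_speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    return f"{h} hours and {m} minutes"

print(block_time(7563.510))  # ~"14 hours and 50 minutes", close to the published estimate
```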
What is the time difference between Pago Pago and Shihezi?
The time difference between Pago Pago and Shihezi is 19 hours. Shihezi (China Standard Time, UTC+8) is 19 hours ahead of Pago Pago (Samoa Standard Time, UTC−11), and neither location observes daylight saving time.
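This can be checked with Python's standard zoneinfo module. Note that Shihezi is in Xinjiang, where airports and schedules use China Standard Time (IANA zone Asia/Shanghai):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

now = datetime.now(ZoneInfo("UTC"))
ppg = now.astimezone(ZoneInfo("Pacific/Pago_Pago"))  # UTC-11, no DST
shf = now.astimezone(ZoneInfo("Asia/Shanghai"))      # UTC+8, no DST

diff_hours = (shf.utcoffset() - ppg.utcoffset()).total_seconds() / 3600
print(f"Shihezi is {diff_hours:.0f} hours ahead of Pago Pago")  # 19
```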
Flight carbon footprint between Pago Pago International Airport (PPG) and Shihezi Huayuan Airport (SHF)
On average, flying from Pago Pago to Shihezi generates about 935 kg of CO2 per passenger, and 935 kilograms equals 2,061 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
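The pound figure is just a unit conversion (1 kg ≈ 2.20462 lb):

```python
CO2_KG = 935            # per-passenger estimate from the text above
LB_PER_KG = 2.20462     # pounds per kilogram
print(round(CO2_KG * LB_PER_KG))  # 2061
```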
Map of flight path from Pago Pago to Shihezi
See the map of the shortest flight path between Pago Pago International Airport (PPG) and Shihezi Huayuan Airport (SHF).
Airport information
| Origin | Pago Pago International Airport |
| --- | --- |
| City: | Pago Pago |
| Country: | American Samoa |
| IATA Code: | PPG |
| ICAO Code: | NSTU |
| Coordinates: | 14°19′51″S, 170°42′36″W |
| Destination | Shihezi Huayuan Airport |
| --- | --- |
| City: | Shihezi |
| Country: | China |
| IATA Code: | SHF |
| ICAO Code: | ZWHZ |
| Coordinates: | 44°14′31″N, 85°53′25″E |