
How far is Zhangjiakou from Pago Pago?

The distance between Pago Pago (Pago Pago International Airport) and Zhangjiakou (Zhangjiakou Ningyuan Airport) is 6069 miles / 9768 kilometers / 5274 nautical miles.

Pago Pago International Airport – Zhangjiakou Ningyuan Airport

  • 6069 miles
  • 9768 kilometers
  • 5274 nautical miles


Distance from Pago Pago to Zhangjiakou

There are several ways to calculate the distance from Pago Pago to Zhangjiakou. Here are two standard methods:

Vincenty's formula (applied above)
  • 6069.311 miles
  • 9767.610 kilometers
  • 5274.087 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
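A minimal sketch of Vincenty's inverse method on the WGS-84 ellipsoid (the standard iterative formulation; the decimal coordinates are converted from the DMS values listed under "Airport information" below):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    # Normalize the longitude difference to [-180, 180] degrees
    L = math.radians((lon2 - lon1 + 540) % 360 - 180)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero for geodesics along the equator
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# PPG (14°19′51″S, 170°42′36″W) to ZQZ (40°44′18″N, 114°55′48″E)
km = vincenty_km(-14.330833, -170.71, 40.738333, 114.93)
print(round(km, 3))   # ≈ 9767.61 km, matching the Vincenty figure above
```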

Haversine formula
  • 6074.371 miles
  • 9775.752 kilometers
  • 5278.484 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
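The haversine calculation is short enough to sketch directly, using the conventional mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# PPG to ZQZ, decimal degrees converted from the DMS coordinates listed below
km = haversine_km(-14.330833, -170.71, 40.738333, 114.93)
print(round(km, 3))   # ≈ 9775.75 km, matching the haversine figure above
```

Note that `sin²(Δλ/2)` is periodic, so the formula handles the ~286° raw longitude difference on this route without explicit normalization.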

How long does it take to fly from Pago Pago to Zhangjiakou?

The estimated flight time from Pago Pago International Airport to Zhangjiakou Ningyuan Airport is 11 hours and 59 minutes.
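The exact estimation formula used here isn't stated, but the quoted time and distance together imply an effective average speed, which a quick sketch can back out:

```python
distance_mi = 6069.311        # Vincenty distance quoted above
hours, minutes = 11, 59       # quoted flight-time estimate
total_h = hours + minutes / 60
implied_mph = distance_mi / total_h
print(round(implied_mph))     # ≈ 506 mph effective average speed
```

That is in the usual range for long-haul jet cruise once taxi, climb, and descent are averaged in.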

Flight carbon footprint between Pago Pago International Airport (PPG) and Zhangjiakou Ningyuan Airport (ZQZ)

On average, flying from Pago Pago to Zhangjiakou generates about 726 kg of CO2 per passenger; 726 kilograms equals about 1,601 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
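The kilogram-to-pound conversion can be checked with the standard factor (the CO2 figure itself is the page's estimate):

```python
KG_TO_LB = 2.2046226218   # standard kg → lb conversion factor
co2_kg = 726              # per-passenger estimate quoted above
co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb))      # 1601
```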

Map of flight path from Pago Pago to Zhangjiakou

See the map of the shortest flight path between Pago Pago International Airport (PPG) and Zhangjiakou Ningyuan Airport (ZQZ).

Airport information

Origin Pago Pago International Airport
City: Pago Pago
Country: American Samoa
IATA Code: PPG
ICAO Code: NSTU
Coordinates: 14°19′51″S, 170°42′36″W
Destination Zhangjiakou Ningyuan Airport
City: Zhangjiakou
Country: China
IATA Code: ZQZ
ICAO Code: ZBZJ
Coordinates: 40°44′18″N, 114°55′48″E
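The DMS coordinates above convert to the signed decimal degrees used in the distance formulas; a small helper makes the conversion explicit:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Degrees/minutes/seconds plus hemisphere letter → signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# PPG: 14°19′51″S, 170°42′36″W   ZQZ: 40°44′18″N, 114°55′48″E
ppg = (dms_to_decimal(14, 19, 51, "S"), dms_to_decimal(170, 42, 36, "W"))
zqz = (dms_to_decimal(40, 44, 18, "N"), dms_to_decimal(114, 55, 48, "E"))
print(ppg)   # (-14.330833..., -170.71)
print(zqz)   # (40.738333..., 114.93)
```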