
How far is Bayanhot from Pago Pago?

The distance between Pago Pago (Pago Pago International Airport) and Bayanhot (Alxa Left Banner Bayanhot Airport) is 6501 miles / 10462 kilometers / 5649 nautical miles.



Distance from Pago Pago to Bayanhot

There are several ways to calculate the distance from Pago Pago to Bayanhot. Here are two standard methods:

Vincenty's formula (applied above)
  • 6500.827 miles
  • 10462.068 kilometers
  • 5649.065 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
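The iterative inverse Vincenty algorithm can be sketched in a few dozen lines. This is a minimal implementation of the standard published iteration, assuming the WGS-84 ellipsoid parameters (the site does not state which ellipsoid it uses); the airport coordinates are taken from the airport information below.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_m(lat1, lon1, lat2, lon2,
               a=6378137.0, f=1 / 298.257223563, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    b = (1 - f) * a                       # semi-minor axis
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L                               # first approximation of longitude difference
    for _ in range(max_iter):
        sinlam, coslam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sinlam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * coslam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:     # converged
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dsigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - dsigma)

# PPG: 14°19′51″S, 170°42′36″W;  AXF: 38°44′53″N, 105°35′18″E
d = vincenty_m(-(14 + 19/60 + 51/3600), -(170 + 42/60 + 36/3600),
               38 + 44/60 + 53/3600, 105 + 35/60 + 18/3600)
print(round(d / 1000, 3), "km")  # close to the 10462 km quoted above
```

Note that Vincenty's iteration can fail to converge for nearly antipodal point pairs; this route is well inside the region where it behaves.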

Haversine formula
  • 6503.783 miles
  • 10466.824 kilometers
  • 5651.633 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (the great-circle distance, i.e. the shortest path between two points along the sphere's surface).
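The haversine calculation is short enough to show in full. This sketch assumes a mean Earth radius of 6371 km (the site does not state which radius it uses) and the airport coordinates listed below.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    # haversine of the central angle between the two points
    h = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(h))

# PPG: 14°19′51″S, 170°42′36″W;  AXF: 38°44′53″N, 105°35′18″E
d = haversine_km(-(14 + 19/60 + 51/3600), -(170 + 42/60 + 36/3600),
                 38 + 44/60 + 53/3600, 105 + 35/60 + 18/3600)
print(round(d, 1), "km")  # about 10467 km
```

Comparing the two methods, the spherical model overestimates this route by roughly 5 km relative to the ellipsoidal Vincenty figure, a difference of well under 0.1%.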

How long does it take to fly from Pago Pago to Bayanhot?

The estimated flight time from Pago Pago International Airport to Alxa Left Banner Bayanhot Airport is 12 hours and 48 minutes.
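The site does not publish the formula behind its flight-time estimate. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to the cruise time (distance divided by cruise speed); with hypothetical parameters of 30 minutes overhead and a 530 mph effective cruise speed, the sketch below lands within a few minutes of the quoted figure.

```python
def flight_time_hours(distance_miles, cruise_mph=530.0, overhead_hours=0.5):
    # Hypothetical parameters: fixed taxi/climb/descent overhead plus cruise time
    return overhead_hours + distance_miles / cruise_mph

t = flight_time_hours(6501)
h, m = int(t), round((t - int(t)) * 60)
print(f"{h} h {m} min")
```

Actual block times vary with winds (especially on long trans-Pacific legs), routing, and aircraft type, so any such estimate is approximate.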

Flight carbon footprint between Pago Pago International Airport (PPG) and Alxa Left Banner Bayanhot Airport (AXF)

On average, flying from Pago Pago to Bayanhot generates about 785 kg of CO2 per passenger; 785 kilograms equals 1,731 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
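The kilograms-to-pounds conversion above can be checked directly, using the exact definition of the avoirdupois pound:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 785
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # 1731
```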

Map of flight path from Pago Pago to Bayanhot

See the map of the shortest flight path between Pago Pago International Airport (PPG) and Alxa Left Banner Bayanhot Airport (AXF).

Airport information

Origin: Pago Pago International Airport
City: Pago Pago
Country: American Samoa
IATA Code: PPG
ICAO Code: NSTU
Coordinates: 14°19′51″S, 170°42′36″W
Destination: Alxa Left Banner Bayanhot Airport
City: Bayanhot
Country: China
IATA Code: AXF
ICAO Code: ZBAL
Coordinates: 38°44′53″N, 105°35′18″E