
How far is Jinghong from Pago Pago?

The distance between Pago Pago (Pago Pago International Airport) and Jinghong (Jinghong Xishuangbanna Gasa Airport) is 6496 miles / 10454 kilometers / 5645 nautical miles.

Pago Pago International Airport – Jinghong Xishuangbanna Gasa Airport

6496 miles / 10454 kilometers / 5645 nautical miles


Distance from Pago Pago to Jinghong

There are several ways to calculate the distance from Pago Pago to Jinghong. Here are two standard methods:

Vincenty's formula (applied above)
  • 6495.845 miles
  • 10454.050 kilometers
  • 5644.735 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
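As an illustration, here is a compact Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (a standard choice of ellipsoid; the calculator's exact parameters are not stated), applied to the two airports' coordinates:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a, f = 6378137.0, 1 / 298.257223563      # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a                          # semi-minor axis
    # Reduced latitudes and longitude difference, normalized to [-180, 180]
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(((lon2 - lon1 + 180) % 360) - 180)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2      # zero only for purely equatorial paths
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# PPG: 14°19′51″S, 170°42′36″W; JHG: 21°58′26″N, 100°45′36″E
ppg = (-(14 + 19 / 60 + 51 / 3600), -(170 + 42 / 60 + 36 / 3600))
jhg = (21 + 58 / 60 + 26 / 3600, 100 + 45 / 60 + 36 / 3600)
print(round(vincenty_km(*ppg, *jhg), 2))  # ≈ 10454 km
```

Note that this sketch does not guard the degenerate cases (coincident or purely equatorial points), which do not arise for this route.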

Haversine formula
  • 6494.040 miles
  • 10451.144 kilometers
  • 5643.166 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
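A minimal Python sketch of the haversine calculation, using the airport coordinates listed below and a mean Earth radius of 6,371 km (the exact radius any given calculator uses may differ slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# PPG: 14°19′51″S, 170°42′36″W; JHG: 21°58′26″N, 100°45′36″E
ppg = (-(14 + 19 / 60 + 51 / 3600), -(170 + 42 / 60 + 36 / 3600))
jhg = (21 + 58 / 60 + 26 / 3600, 100 + 45 / 60 + 36 / 3600)
print(round(haversine_km(*ppg, *jhg), 1))  # ≈ 10451 km
```

The few-kilometer gap between this result and the Vincenty figure reflects the spherical versus ellipsoidal Earth models.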

How long does it take to fly from Pago Pago to Jinghong?

The estimated flight time from Pago Pago International Airport to Jinghong Xishuangbanna Gasa Airport is 12 hours and 47 minutes.
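Estimates like this typically come from dividing the great-circle distance by an assumed average block speed. A sketch of that arithmetic (the 500 mph average speed is an illustrative assumption, not the site's published method):

```python
def flight_time(distance_miles, avg_speed_mph=500.0):
    """Rough flight-time estimate: distance over an assumed average speed."""
    total_min = round(distance_miles / avg_speed_mph * 60)
    return divmod(total_min, 60)  # (hours, minutes)

hours, minutes = flight_time(6496)
print(f"{hours} h {minutes} min")  # 13 h 0 min at 500 mph
```

At 500 mph this gives 13 h 0 min; the 12 hours 47 minutes quoted above implies an average speed closer to 508 mph over the route.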

Flight carbon footprint between Pago Pago International Airport (PPG) and Jinghong Xishuangbanna Gasa Airport (JHG)

On average, flying from Pago Pago to Jinghong generates about 785 kg of CO2 per passenger, which is equivalent to about 1,730 pounds (lbs). These figures are estimates that include only the CO2 generated by burning jet fuel.
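Per-passenger CO2 figures like this are typically the route distance multiplied by an emission factor. A sketch of the arithmetic (the ≈0.121 kg CO2 per passenger-mile factor is simply back-derived from the figures above, not an official coefficient):

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

distance_miles = 6496
co2_kg_per_passenger = 785

# Emission factor implied by this route's figures
factor = co2_kg_per_passenger / distance_miles
print(round(factor, 3))               # ≈ 0.121 kg CO2 per passenger-mile

# Kilograms to pounds
co2_lbs = co2_kg_per_passenger / KG_PER_LB
print(round(co2_lbs, 1))              # ≈ 1730.6 lbs, i.e. about 1,730 lbs
```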

Map of flight path from Pago Pago to Jinghong

See the map of the shortest flight path between Pago Pago International Airport (PPG) and Jinghong Xishuangbanna Gasa Airport (JHG).

Airport information

Origin: Pago Pago International Airport
City: Pago Pago
Country: American Samoa
IATA Code: PPG
ICAO Code: NSTU
Coordinates: 14°19′51″S, 170°42′36″W
Destination: Jinghong Xishuangbanna Gasa Airport
City: Jinghong
Country: China
IATA Code: JHG
ICAO Code: ZPJH
Coordinates: 21°58′26″N, 100°45′36″E