How far is Piseo-ri (Muan) from Pago Pago?

The distance between Pago Pago (Pago Pago International Airport) and Piseo-ri (Muan) (Muan International Airport) is 5336 miles / 8588 kilometers / 4637 nautical miles.

Pago Pago International Airport – Muan International Airport


Distance from Pago Pago to Piseo-ri (Muan)

There are several ways to calculate the distance from Pago Pago to Piseo-ri (Muan). Here are two standard methods:

Vincenty's formula (applied above)
  • 5336.304 miles
  • 8587.949 kilometers
  • 4637.122 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
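
Vincenty's method has no closed form; it iterates on the ellipsoid until the longitude difference converges. Below is a minimal, self-contained Python sketch of the standard inverse formula on WGS-84 (the calculator's own implementation isn't published, and the function name here is ours):

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid.
    Returns the distance in meters, or None if the iteration fails to
    converge (possible for nearly antipodal points)."""
    a = 6378137.0                      # semi-major axis (m)
    f = 1 / 298.257223563              # flattening
    b = (1 - f) * a                    # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero when the path follows the equator
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    else:
        return None                    # did not converge

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                          * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)   # meters

# PPG (14°19′51″S, 170°42′36″W) -> MWX (34°59′29″N, 126°22′58″E)
print(vincenty_distance(-14.330833, -170.71, 34.991389, 126.382778))
# ≈ 8,587,949 m, i.e. about 8588 km
```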

Haversine formula
  • 5342.104 miles
  • 8597.282 kilometers
  • 4642.161 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth, giving the great-circle distance (the shortest path between the two points along the surface).
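
For comparison, the haversine formula fits in a few lines. A sketch assuming the conventional 6371 km mean Earth radius:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance via the haversine formula, assuming a
    spherical Earth with the given mean radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# Same endpoints as above, in decimal degrees
print(haversine_distance(-14.330833, -170.71, 34.991389, 126.382778))
# ≈ 8597 km, slightly longer than the ellipsoidal result
```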

How long does it take to fly from Pago Pago to Piseo-ri (Muan)?

The estimated flight time from Pago Pago International Airport to Muan International Airport is 10 hours and 36 minutes.
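
The page doesn't publish the parameters behind this estimate. A common rule of thumb is distance divided by an assumed average speed, plus a fixed allowance for takeoff and landing; the sketch below uses hypothetical defaults (500 mph, 30 minutes), so it will not exactly reproduce the 10 hours 36 minutes figure:

```python
def estimated_flight_time(distance_miles, avg_speed_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time at an assumed average
    ground speed plus a fixed takeoff/landing allowance. Both defaults
    are assumptions, not the calculator's published parameters."""
    total_min = distance_miles / avg_speed_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes:02d} min"

print(estimated_flight_time(5336.304))  # "11 h 10 min" with these assumptions
```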

Flight carbon footprint between Pago Pago International Airport (PPG) and Muan International Airport (MWX)

On average, flying from Pago Pago to Piseo-ri (Muan) generates about 628 kg of CO2 per passenger; 628 kilograms equals 1,384 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
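
The emission model isn't published either. As a sanity check, the sketch below uses a per-passenger emission factor back-calculated from the figures above (roughly 0.073 kg CO2 per passenger-kilometer, an assumption rather than the calculator's model) together with the exact kilogram-to-pound conversion:

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

def co2_per_passenger_kg(distance_km, kg_co2_per_km=0.0731):
    """Per-passenger CO2 from an assumed emission factor in kg CO2 per
    passenger-km. The 0.0731 default is back-calculated from the
    628 kg / 8588 km figures above, not the calculator's real model."""
    return distance_km * kg_co2_per_km

kg = co2_per_passenger_kg(8588)
print(f"{kg:.0f} kg = {kg / KG_PER_LB:.0f} lb")  # 628 kg = 1384 lb
```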

Map of flight path from Pago Pago to Piseo-ri (Muan)

See the map of the shortest flight path between Pago Pago International Airport (PPG) and Muan International Airport (MWX).

Airport information

Origin: Pago Pago International Airport
City: Pago Pago
Country: American Samoa
IATA Code: PPG
ICAO Code: NSTU
Coordinates: 14°19′51″S, 170°42′36″W

Destination: Muan International Airport
City: Piseo-ri (Muan)
Country: South Korea
IATA Code: MWX
ICAO Code: RKJB
Coordinates: 34°59′29″N, 126°22′58″E
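
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion helper (hypothetical name) shows how the two notations relate:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to
    signed decimal degrees (S and W are negative)."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# PPG: 14°19′51″S, 170°42′36″W
ppg = (dms_to_decimal(14, 19, 51, "S"), dms_to_decimal(170, 42, 36, "W"))
# MWX: 34°59′29″N, 126°22′58″E
mwx = (dms_to_decimal(34, 59, 29, "N"), dms_to_decimal(126, 22, 58, "E"))
print(ppg)  # (-14.330833..., -170.71)
print(mwx)  # (34.991388..., 126.382777...)
```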