
How far is Piseo-ri (Muan) from Wellington?

The distance between Wellington (Wellington International Airport) and Piseo-ri (Muan) (Muan International Airport) is 6084 miles / 9792 kilometers / 5287 nautical miles.

Wellington International Airport – Muan International Airport

6084 miles / 9792 kilometers / 5287 nautical miles


Distance from Wellington to Piseo-ri (Muan)

There are several ways to calculate the distance from Wellington to Piseo-ri (Muan). Here are two standard methods:

Vincenty's formula (applied above)
  • 6084.487 miles
  • 9792.033 kilometers
  • 5287.275 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
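As a rough illustration (not the calculator's own code), the following Python sketch implements Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed further down this page. The constants, convergence tolerance and iteration limit are assumptions; nearly antipodal or purely equatorial point pairs need extra handling that this sketch leaves out.

```python
import math

# WGS-84 ellipsoid parameters (assumed; Vincenty's formula can use any ellipsoid)
WGS84_A = 6378137.0            # semi-major axis in metres
WGS84_F = 1 / 298.257223563    # flattening
WGS84_B = WGS84_A * (1 - WGS84_F)

def vincenty_distance_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal (Vincenty) distance in kilometres between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - WGS84_F) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - WGS84_F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):   # may not converge for nearly antipodal points
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                               # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2              # zero only on the equator (not handled here)
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = WGS84_F / 16 * cos2_alpha * (4 + WGS84_F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * WGS84_F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (WGS84_A ** 2 - WGS84_B ** 2) / WGS84_B ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return WGS84_B * A * (sigma - delta_sigma) / 1000.0

# WLG (41°19′37″S, 174°48′17″E) to MWX (34°59′29″N, 126°22′58″E)
print(vincenty_distance_km(-41.3269, 174.8047, 34.9914, 126.3828))  # ≈ 9792 km
```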

Haversine formula
  • 6101.253 miles
  • 9819.015 kilometers
  • 5301.844 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
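For comparison, here is a minimal haversine sketch assuming a mean earth radius of 6371 km (a common, but not universal, choice); with the same coordinates it lands close to the figures quoted above.

```python
import math

EARTH_RADIUS_KM = 6371.0   # assumed mean earth radius for the spherical model

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# WLG (41°19′37″S, 174°48′17″E) to MWX (34°59′29″N, 126°22′58″E)
km = haversine_km(-41.3269, 174.8047, 34.9914, 126.3828)
print(f"{km:.0f} km ≈ {km / 1.609344:.0f} mi ≈ {km / 1.852:.0f} NM")  # ≈ 9819 km
```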

How long does it take to fly from Wellington to Piseo-ri (Muan)?

The estimated flight time from Wellington International Airport to Muan International Airport is 12 hours and 1 minute.
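The page does not state how this figure is derived, but a simple model that assumes an average cruise speed of about 850 km/h plus roughly 30 minutes for taxi, climb and descent reproduces it closely; the sketch below shows that arithmetic (both the speed and the fixed allowance are assumptions).

```python
# Rough flight-time estimate from distance (assumed model, not the site's own)
distance_km = 9792.033    # Vincenty distance quoted above
cruise_kmh = 850          # assumed average cruise speed
overhead_min = 30         # assumed allowance for taxi, climb and descent

total_min = distance_km / cruise_kmh * 60 + overhead_min
hours, minutes = divmod(round(total_min), 60)
print(f"{hours} h {minutes:02d} min")   # → 12 h 01 min
```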

Flight carbon footprint between Wellington International Airport (WLG) and Muan International Airport (MWX)

On average, flying from Wellington to Piseo-ri (Muan) generates about 728 kg of CO2 per passenger; 728 kilograms equals 1,605 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
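The pound figure is just a unit conversion of the per-passenger estimate, as this small check shows (the conversion factor is the standard 1 kg ≈ 2.20462 lb):

```python
co2_kg = 728                  # per-passenger estimate quoted above
co2_lbs = co2_kg * 2.20462    # 1 kg ≈ 2.20462 lb
print(round(co2_lbs))         # → 1605
```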

Map of flight path from Wellington to Piseo-ri (Muan)

See the map of the shortest flight path between Wellington International Airport (WLG) and Muan International Airport (MWX).

Airport information

Origin Wellington International Airport
City: Wellington
Country: New Zealand
IATA Code: WLG
ICAO Code: NZWN
Coordinates: 41°19′37″S, 174°48′17″E
Destination Muan International Airport
City: Piseo-ri (Muan)
Country: South Korea
IATA Code: MWX
ICAO Code: RKJB
Coordinates: 34°59′29″N, 126°22′58″E