
How far is Piseo-ri (Muan) from Uray?

The distance between Uray (Uray Airport) and Piseo-ri (Muan) (Muan International Airport) is 3201 miles / 5152 kilometers / 2782 nautical miles.

The driving distance from Uray (URJ) to Piseo-ri (Muan) (MWX) is 4627 miles / 7446 kilometers, and travel time by car is about 90 hours 40 minutes.

Uray Airport – Muan International Airport

3201 miles / 5152 kilometers / 2782 nautical miles


Distance from Uray to Piseo-ri (Muan)

There are several ways to calculate the distance from Uray to Piseo-ri (Muan). Here are two standard methods:

Vincenty's formula (applied above)
  • 3201.271 miles
  • 5151.946 kilometers
  • 2781.828 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
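As a sketch of this method, here is a textbook implementation of Vincenty's inverse formula on the WGS-84 ellipsoid (the ellipsoid choice is an assumption about the site's parameters; the coordinates are decimal-degree equivalents of the airport listings further down the page):

```python
import math

# WGS-84 ellipsoid parameters (assumed; the page does not state its datum).
A_AXIS = 6378137.0               # semi-major axis, metres
F = 1 / 298.257223563            # flattening
B_AXIS = (1 - F) * A_AXIS        # semi-minor axis, metres

def vincenty_km(lat1, lon1, lat2, lon2):
    """Geodesic distance in km between two points on the WGS-84 ellipsoid."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero for equatorial lines where cos2_alpha == 0
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return B_AXIS * A * (sigma - delta_sigma) / 1000.0

# URJ (60°6′11″N, 64°49′36″E) to MWX (34°59′29″N, 126°22′58″E)
print(f"{vincenty_km(60.103056, 64.826667, 34.991389, 126.382778):.1f} km")
```

The iteration usually converges in a handful of steps; the result should land within metres of the 5151.946 km figure above.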

Haversine formula
  • 3194.916 miles
  • 5141.718 kilometers
  • 2776.306 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
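A minimal Python sketch of the haversine method, using the two airports' coordinates listed below in decimal degrees (the 6371 km mean Earth radius is an assumption about the site's choice of constant):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius (assumed constant)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return EARTH_RADIUS_KM * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# URJ (60°6′11″N, 64°49′36″E) and MWX (34°59′29″N, 126°22′58″E)
km = haversine_km(60.103056, 64.826667, 34.991389, 126.382778)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} nmi")
```

This reproduces the ~5142 km figure above to within a few kilometres; the small gap versus the Vincenty result reflects the spherical-Earth simplification.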

How long does it take to fly from Uray to Piseo-ri (Muan)?

The estimated flight time from Uray Airport to Muan International Airport is 6 hours and 33 minutes.
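Estimates like this typically come from a distance-based rule of thumb rather than published schedules. A sketch, assuming an average block speed of about 525 mph plus a 30-minute allowance for taxi, takeoff, climb, and landing (both constants are assumptions, not the site's stated method):

```python
CRUISE_MPH = 525      # assumed average block speed
OVERHEAD_HOURS = 0.5  # assumed taxi/takeoff/landing allowance

def estimate_flight_time(distance_miles):
    """Rule-of-thumb flight time as (hours, minutes)."""
    total_hours = OVERHEAD_HOURS + distance_miles / CRUISE_MPH
    return divmod(round(total_hours * 60), 60)

hours, minutes = estimate_flight_time(3201)
print(f"About {hours} h {minutes} min")  # within a few minutes of the figure above
```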

Flight carbon footprint between Uray Airport (URJ) and Muan International Airport (MWX)

On average, flying from Uray to Piseo-ri (Muan) generates about 358 kg of CO2 per passenger, which is equivalent to 790 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
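The per-passenger figure is consistent with a flat emission factor of roughly 70 g of CO2 per passenger-kilometre. A sketch assuming that factor (the site's actual model is not published here):

```python
CO2_G_PER_PASSENGER_KM = 69.5  # assumed flat emission factor
KG_PER_LB = 0.45359237         # exact kg-to-pound conversion

def co2_per_passenger_kg(distance_km):
    """Estimated jet-fuel CO2 per passenger, in kilograms."""
    return distance_km * CO2_G_PER_PASSENGER_KM / 1000

kg = co2_per_passenger_kg(5152)
print(f"{kg:.0f} kg = {kg / KG_PER_LB:.0f} lbs")
```

Real per-flight emissions vary with aircraft type, load factor, and routing, so a flat factor like this is only a rough approximation.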

Map of flight path and driving directions from Uray to Piseo-ri (Muan)

See the map of the shortest flight path between Uray Airport (URJ) and Muan International Airport (MWX).

Airport information

Origin Uray Airport
City: Uray
Country: Russia
IATA Code: URJ
ICAO Code: USHU
Coordinates: 60°6′11″N, 64°49′36″E
Destination Muan International Airport
City: Piseo-ri (Muan)
Country: South Korea
IATA Code: MWX
ICAO Code: RKJB
Coordinates: 34°59′29″N, 126°22′58″E