
How far is Magway from Salt Lake City, UT?

The distance between Salt Lake City (Salt Lake City International Airport) and Magway (Magway Airport) is 7896 miles / 12708 kilometers / 6862 nautical miles.

Salt Lake City International Airport – Magway Airport

  • Distance: 7896 miles / 12708 kilometers / 6862 nautical miles
  • Flight time: 15 h 27 min
  • Time difference: 13 h 30 min
  • CO2 emission: 983 kg

Distance from Salt Lake City to Magway

There are several ways to calculate the distance from Salt Lake City to Magway. Here are two standard methods:

Vincenty's formula (applied above)
  • 7896.230 miles
  • 12707.750 kilometers
  • 6861.636 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
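As a sketch of how such a calculation works, the standard Vincenty inverse method on the WGS-84 ellipsoid can be implemented roughly as follows (the function name, iteration limit, and tolerance here are illustrative choices, not the site's actual code):

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Geodesic distance in metres between two lat/lon points (degrees),
    using Vincenty's inverse formula on the WGS-84 ellipsoid."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    # Reduced latitudes and normalized longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    L = (L + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
    lam = L

    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial geodesic
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # distance in metres
```

Running this with the SLC and MWQ coordinates listed below (40.78833, -111.97778 and 20.16556, 94.94139) yields a distance of about 12,708 km, matching the figure above.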

Haversine formula
  • 7884.301 miles
  • 12688.553 kilometers
  • 6851.271 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
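The haversine calculation is much simpler, since it needs only a single mean Earth radius. A minimal sketch, assuming the commonly used radius of 6371 km:

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two lat/lon points (degrees),
    assuming a spherical Earth of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))
```

With the SLC and MWQ coordinates listed below, this gives roughly 12,688 km, in line with the haversine figure above; the small gap versus Vincenty's result reflects the spherical-versus-ellipsoidal model, not an error in either formula.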

How long does it take to fly from Salt Lake City to Magway?

The estimated flight time from Salt Lake City International Airport to Magway Airport is 15 hours and 27 minutes.
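Flight-time estimates like this are typically derived from distance alone. The site's exact parameters are not published, but a simple model that assumes a cruise speed of about 530 mph plus 30 minutes of taxi/climb overhead (both figures are assumptions for illustration) lands close to the quoted time:

```python
def estimate_flight_time(distance_miles, cruise_mph=530, overhead_min=30):
    """Rough flight time as (hours, minutes); cruise speed and
    overhead are hypothetical parameters, not the site's actual model."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    return divmod(round(total_min), 60)

estimate_flight_time(7896)  # → (15, 24), close to the quoted 15 h 27 min
```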

Flight carbon footprint between Salt Lake City International Airport (SLC) and Magway Airport (MWQ)

On average, flying from Salt Lake City to Magway generates about 983 kg of CO2 per passenger; 983 kilograms equals roughly 2,167 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
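The pound figure is a straight unit conversion from the kilogram estimate, using the standard factor of about 2.20462 lb/kg:

```python
def kg_to_lbs(kg):
    """Convert kilograms to pounds using the standard factor 2.20462 lb/kg."""
    return round(kg * 2.20462)

kg_to_lbs(983)  # → 2167
```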

Map of flight path from Salt Lake City to Magway

See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Magway Airport (MWQ).

Airport information

Origin: Salt Lake City International Airport
City: Salt Lake City, UT
Country: United States
IATA Code: SLC
ICAO Code: KSLC
Coordinates: 40°47′18″N, 111°58′40″W
Destination: Magway Airport
City: Magway
Country: Burma
IATA Code: MWQ
ICAO Code: VYMW
Coordinates: 20°9′56″N, 94°56′29″E