
How far is Magway from San Luis Potosi?

The distance between San Luis Potosi (San Luis Potosí International Airport) and Magway (Magway Airport) is 9325 miles / 15008 kilometers / 8104 nautical miles.

San Luis Potosí International Airport – Magway Airport

Distance: 9325 miles / 15008 kilometers / 8104 nautical miles
Flight time: 18 h 9 min
Time difference: 12 h 30 min
CO2 emission: 1 197 kg


Distance from San Luis Potosi to Magway

There are several ways to calculate the distance from San Luis Potosi to Magway. Here are two standard methods:

Vincenty's formula (applied above)
  • 9325.438 miles
  • 15007.838 kilometers
  • 8103.584 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
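Vincenty's inverse method can be sketched in Python as below. The WGS-84 ellipsoid constants are the standard choice; the calculator's exact parameters and iteration tolerance are not stated, so this is an illustration rather than the site's implementation:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in meters between two points on the WGS-84 ellipsoid,
    computed with Vincenty's iterative inverse formula."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate the longitude difference to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha != 0 else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    # Series expansion for the ellipsoidal correction
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# SLP (22°15′15″N, 100°55′51″W) to MWQ (20°9′56″N, 94°56′29″E)
d_km = vincenty_distance(22.254167, -100.930833, 20.165556, 94.941389) / 1000
```

Run on the two airports' coordinates, this yields roughly the 15007.8 km quoted above.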

Haversine formula
  • 9316.378 miles
  • 14993.256 kilometers
  • 8095.711 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
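The haversine calculation is compact enough to show in full. A mean Earth radius of 6371 km is assumed here; the calculator's exact radius is not stated:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# SLP (22°15′15″N, 100°55′51″W) to MWQ (20°9′56″N, 94°56′29″E)
d_km = haversine_distance(22.254167, -100.930833, 20.165556, 94.941389)
```

With these inputs the result lands close to the 14993.3 km figure above; the small gap versus Vincenty reflects the spherical versus ellipsoidal Earth models.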

How long does it take to fly from San Luis Potosi to Magway?

The estimated flight time from San Luis Potosí International Airport to Magway Airport is 18 hours and 9 minutes.
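Such estimates are typically derived by dividing the great-circle distance by an assumed average speed. The calculator's speed parameter is not published; the 514 mph figure below is reverse-engineered for illustration, since it reproduces the quoted 18 h 9 min:

```python
def flight_time(distance_miles, avg_speed_mph):
    """Rough flight time as (hours, minutes) from distance and an
    assumed average ground speed."""
    total_minutes = round(distance_miles / avg_speed_mph * 60)
    return divmod(total_minutes, 60)

hours, minutes = flight_time(9325, 514)  # assumed speed, illustrative only
```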

Flight carbon footprint between San Luis Potosí International Airport (SLP) and Magway Airport (MWQ)

On average, flying from San Luis Potosi to Magway generates about 1 197 kg of CO2 per passenger, which is equivalent to 2 639 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
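As a quick sanity check on the unit conversion, the kilogram figure can be converted to pounds using the exact definition of the pound:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound
co2_kg = 1197
co2_lb = round(co2_kg / KG_PER_LB)  # → 2639, matching the figure quoted
```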

Map of flight path from San Luis Potosi to Magway

See the map of the shortest flight path between San Luis Potosí International Airport (SLP) and Magway Airport (MWQ).

Airport information

Origin San Luis Potosí International Airport
City: San Luis Potosi
Country: Mexico
IATA Code: SLP
ICAO Code: MMSP
Coordinates: 22°15′15″N, 100°55′51″W
Destination Magway Airport
City: Magway
Country: Burma
IATA Code: MWQ
ICAO Code: VYMW
Coordinates: 20°9′56″N, 94°56′29″E