How far is Magway from Luang Namtha?
The distance between Luang Namtha (Louang Namtha Airport) and Magway (Magway Airport) is 422 miles / 679 kilometers / 367 nautical miles.
The driving distance from Luang Namtha (LXG) to Magway (MWQ) is 674 miles / 1085 kilometers, and travel time by car is about 15 hours 17 minutes.
Louang Namtha Airport – Magway Airport
Distance from Luang Namtha to Magway
There are several ways to calculate the distance from Luang Namtha to Magway. Here are two standard methods:
Vincenty's formula (applied above)
- 422.031 miles
- 679.192 kilometers
- 366.735 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
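As an illustration, the ellipsoidal calculation can be sketched with the standard Vincenty inverse formula on the WGS-84 ellipsoid (the ellipsoid is an assumption here; the exact model used above is not stated). The airport coordinates are converted from the degrees/minutes/seconds given below to decimal degrees.

```python
import math

def vincenty(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Geodesic distance in metres via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)  # metres

# Louang Namtha Airport (20°58′1″N, 101°24′0″E) to Magway Airport (20°9′56″N, 94°56′29″E)
d_km = vincenty(20.9669, 101.4000, 20.1656, 94.9414) / 1000
```

With these inputs the result lands close to the 679.192 km quoted above.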
Haversine formula
- 421.427 miles
- 678.221 kilometers
- 366.210 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
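The haversine calculation is much shorter. A minimal sketch, assuming a mean Earth radius of 6371 km and the decimal-degree airport coordinates from the tables below:

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Louang Namtha Airport to Magway Airport
d_km = haversine(20.9669, 101.4000, 20.1656, 94.9414)
```

This reproduces the 678.221 km figure to within a kilometre or so; the small gap versus Vincenty's result reflects the spherical-versus-ellipsoidal Earth models.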
How long does it take to fly from Luang Namtha to Magway?
The estimated flight time from Louang Namtha Airport to Magway Airport is 1 hour and 17 minutes.
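As a rough sanity check on that figure, a common back-of-the-envelope estimate (an assumption here, not necessarily the method used for the quoted time) divides the great-circle distance by a typical airliner cruise speed of about 500 mph and adds roughly 30 minutes for taxi, climb, and descent:

```python
distance_mi = 422     # great-circle distance from above
cruise_mph = 500      # assumed average cruise speed
overhead_min = 30     # assumed taxi/climb/descent allowance

flight_min = distance_mi / cruise_mph * 60 + overhead_min
# roughly 81 minutes, in the same ballpark as the 1 h 17 min quoted above
```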
What is the time difference between Luang Namtha and Magway?
Laos observes UTC+7 and Myanmar observes UTC+6:30, so the time difference is 30 minutes: Magway is 30 minutes behind Luang Namtha.
Flight carbon footprint between Louang Namtha Airport (LXG) and Magway Airport (MWQ)
On average, flying from Luang Namtha to Magway generates about 87 kg (192 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
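The kilogram-to-pound conversion in that estimate can be checked directly:

```python
KG_TO_LB = 2.20462        # pounds per kilogram
co2_kg = 87               # estimated CO2 per passenger
co2_lb = co2_kg * KG_TO_LB
# about 192 lb, matching the figure quoted above
```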
Map of flight path and driving directions from Luang Namtha to Magway
See the map of the shortest flight path between Louang Namtha Airport (LXG) and Magway Airport (MWQ).
Airport information
| Origin | Louang Namtha Airport |
| --- | --- |
| City: | Luang Namtha |
| Country: | Laos |
| IATA Code: | LXG |
| ICAO Code: | VLLN |
| Coordinates: | 20°58′1″N, 101°24′0″E |
| Destination | Magway Airport |
| --- | --- |
| City: | Magway |
| Country: | Burma |
| IATA Code: | MWQ |
| ICAO Code: | VYMW |
| Coordinates: | 20°9′56″N, 94°56′29″E |