
How far is Beijing from Magway?

The distance between Magway (Magway Airport) and Beijing (Beijing Nanyuan Airport) is 1855 miles / 2986 kilometers / 1612 nautical miles.

The driving distance from Magway (MWQ) to Beijing (NAY) is 2382 miles / 3833 kilometers, and travel time by car is about 44 hours 58 minutes.

Magway Airport – Beijing Nanyuan Airport

Distance: 1855 miles / 2986 kilometers / 1612 nautical miles
Flight time: 4 h 0 min
Time difference: 1 h 30 min
CO2 emission: 205 kg

Distance from Magway to Beijing

There are several ways to calculate the distance from Magway to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 1855.467 miles
  • 2986.084 kilometers
  • 1612.356 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
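For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of the standard iterative Vincenty inverse method on the WGS-84 ellipsoid. It is an illustrative textbook implementation, not the calculator's own code; the coordinates are taken from the airport information table below.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2):
        """Iterative Vincenty inverse solution on the WGS-84 ellipsoid (metres)."""
        a = 6378137.0               # semi-major axis (m)
        f = 1 / 298.257223563       # flattening
        b = (1 - f) * a             # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):        # iterate lambda until it converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0          # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
                  (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # Coordinates from the airport information table below
    mwq = (20.16556, 94.94139)    # Magway Airport, 20°9′56″N 94°56′29″E
    nay = (39.78278, 116.38778)   # Beijing Nanyuan Airport, 39°46′58″N 116°23′16″E
    metres = vincenty_distance(*mwq, *nay)
    print(round(metres / 1609.344, 3), "miles")  # should be close to 1855.467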

Haversine formula
  • 1856.768 miles
  • 2988.179 kilometers
  • 1613.487 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the sphere's surface).
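Below is a short Python sketch of the haversine formula, assuming the commonly used mean earth radius of 6371 km; with the airport coordinates listed below it lands close to the figures above.

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given mean earth radius (km)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        # haversine of the central angle between the two points
        h = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    km = haversine_distance(20.16556, 94.94139, 39.78278, 116.38778)
    print(round(km, 3), "km")  # ≈ 2988 km, close to the haversine figure above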

How long does it take to fly from Magway to Beijing?

The estimated flight time from Magway Airport to Beijing Nanyuan Airport is 4 hours and 0 minutes.

Flight carbon footprint between Magway Airport (MWQ) and Beijing Nanyuan Airport (NAY)

On average, flying from Magway to Beijing generates about 205 kg of CO2 per passenger, equivalent to about 451 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
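As a quick check on that conversion, using the standard factor of about 2.20462 lb per kg:

    205 kg × 2.20462 lb/kg ≈ 451.9 lb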

Map of flight path and driving directions from Magway to Beijing

See the map of the shortest flight path between Magway Airport (MWQ) and Beijing Nanyuan Airport (NAY).

Airport information

Origin Magway Airport
City: Magway
Country: Burma
IATA Code: MWQ
ICAO Code: VYMW
Coordinates: 20°9′56″N, 94°56′29″E
Destination Beijing Nanyuan Airport
City: Beijing
Country: China
IATA Code: NAY
ICAO Code: ZBNY
Coordinates: 39°46′58″N, 116°23′16″E