
How far is Bazhong from Magway?

The distance between Magway (Magway Airport) and Bazhong (Bazhong Enyang Airport) is 1078 miles / 1735 kilometers / 937 nautical miles.

The driving distance from Magway (MWQ) to Bazhong (BZX) is 1483 miles / 2387 kilometers, and travel time by car is about 28 hours 35 minutes.

Magway Airport – Bazhong Enyang Airport

Distance: 1078 miles / 1735 kilometers / 937 nautical miles
Flight time: 2 h 32 min
Time difference: 1 h 30 min
CO2 emission: 156 kg


Distance from Magway to Bazhong

There are several ways to calculate the distance from Magway to Bazhong. Here are two standard methods:

Vincenty's formula (applied above)
  • 1077.866 miles
  • 1734.657 kilometers
  • 936.640 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
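For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula. It assumes the WGS-84 ellipsoid (the page does not say which ellipsoid it uses) and the MWQ/BZX coordinates listed in the airport information section, converted to decimal degrees; if the site also uses WGS-84, the result should agree closely with the 1734.657 km quoted above.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    lam = L
    for _ in range(200):      # iterate until lambda converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                        if cos_sq_alpha else 0.0)
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# MWQ and BZX in decimal degrees (converted from the DMS values listed below)
metres = vincenty_distance(20.1656, 94.9414, 31.7383, 106.6447)
print(round(metres / 1000, 3), "km")   # ≈ 1734.7 km
```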

Haversine formula
  • 1079.164 miles
  • 1736.746 kilometers
  • 937.768 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth (great-circle distance, the shortest path between two points along the surface of a sphere).
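The haversine figure is easy to reproduce. Here is a minimal Python sketch using the conventional mean Earth radius of 6371 km (a different radius choice shifts the result slightly):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# MWQ -> BZX, decimal degrees
km = haversine_km(20.1656, 94.9414, 31.7383, 106.6447)
print(round(km, 3), "km")              # ≈ 1736.7 km
print(round(km * 0.621371, 3), "mi")   # ≈ 1079.2 miles
print(round(km * 0.539957, 3), "nm")   # ≈ 937.8 nautical miles
```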

How long does it take to fly from Magway to Bazhong?

The estimated flight time from Magway Airport to Bazhong Enyang Airport is 2 hours and 32 minutes.
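The page does not publish the formula behind this estimate. Calculators of this kind typically add a fixed allowance for taxi, take-off and landing to the great-circle distance flown at a typical cruise speed; the sketch below uses assumed values of 30 minutes and 500 mph, which lands in the same ballpark as the 2 hours 32 minutes shown above rather than reproducing it exactly.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: fixed taxi/climb/descent allowance plus cruise."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1078))   # ~2 h 39 min with these assumed parameters
```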

Flight carbon footprint between Magway Airport (MWQ) and Bazhong Enyang Airport (BZX)

On average, flying from Magway to Bazhong generates about 156 kg of CO2 per passenger, equivalent to roughly 344 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
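The figure works out to roughly 0.09 kg of CO2 per passenger-kilometre (156 kg over 1735 km). The page does not document its methodology, so the factor in the sketch below is simply implied by those two numbers rather than taken from the site:

```python
KG_CO2_PER_PAX_KM = 0.09   # implied by 156 kg / 1735 km; not a documented value from this site
LBS_PER_KG = 2.20462       # kilograms-to-pounds conversion factor

def co2_per_passenger_kg(distance_km, factor=KG_CO2_PER_PAX_KM):
    """Per-passenger CO2 from jet-fuel burn only, matching the page's scope."""
    return distance_km * factor

kg = co2_per_passenger_kg(1735)
print(round(kg), "kg /", round(kg * LBS_PER_KG), "lbs")   # ≈ 156 kg / ≈ 344 lbs
```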

Map of flight path and driving directions from Magway to Bazhong

See the map of the shortest flight path between Magway Airport (MWQ) and Bazhong Enyang Airport (BZX).

Airport information

Origin: Magway Airport
City: Magway
Country: Burma
IATA Code: MWQ
ICAO Code: VYMW
Coordinates: 20°9′56″N, 94°56′29″E
Destination: Bazhong Enyang Airport
City: Bazhong
Country: China
IATA Code: BZX
ICAO Code: ZUBZ
Coordinates: 31°44′18″N, 106°38′41″E
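
The coordinates above are listed in degrees, minutes and seconds, while the distance formulas earlier on the page expect decimal degrees. A small conversion helper (the function name and argument order are illustrative, not from any particular library):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Magway Airport (MWQ): 20°9′56″N, 94°56′29″E
mwq = (dms_to_decimal(20, 9, 56, "N"), dms_to_decimal(94, 56, 29, "E"))
# Bazhong Enyang Airport (BZX): 31°44′18″N, 106°38′41″E
bzx = (dms_to_decimal(31, 44, 18, "N"), dms_to_decimal(106, 38, 41, "E"))
print(mwq)   # approximately (20.1656, 94.9414)
print(bzx)   # approximately (31.7383, 106.6447)
```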