
How far is Anshan from Magway?

The distance between Magway (Magway Airport) and Anshan (Anshan Teng'ao Airport) is 2183 miles / 3514 kilometers / 1897 nautical miles.

The driving distance from Magway (MWQ) to Anshan (AOG) is 2778 miles / 4471 kilometers, and travel time by car is about 51 hours 59 minutes.

Magway Airport – Anshan Teng'ao Airport

Distance: 2183 miles / 3514 kilometers / 1897 nautical miles
Flight time: 4 h 38 min
Time difference: 1 h 30 min
CO2 emission: 238 kg


Distance from Magway to Anshan

There are several ways to calculate the distance from Magway to Anshan. Here are two standard methods:

Vincenty's formula (applied above)
  • 2183.414 miles
  • 3513.864 kilometers
  • 1897.335 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
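As an illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid, evaluated with the airport coordinates listed below (converted to decimal degrees). It omits the edge-case handling a production implementation needs for coincident or nearly antipodal points.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance (Vincenty inverse) in statute miles.

    Minimal sketch on the WGS-84 ellipsoid; no handling for
    coincident or nearly antipodal points.
    """
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = a * (1 - f)          # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate lambda to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                          * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344  # meters -> statute miles

# MWQ (20°9′56″N, 94°56′29″E) and AOG (41°6′19″N, 122°51′14″E) in decimal degrees
print(vincenty_miles(20.1656, 94.9414, 41.1053, 122.8539))  # ~2183.4 miles
```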

Haversine formula
  • 2183.781 miles
  • 3514.454 kilometers
  • 1897.654 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
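For comparison, a minimal haversine sketch in Python. The mean Earth radius of 6371 km (3958.8 miles) is an assumed value; a different radius choice shifts the result slightly.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles on a spherical Earth."""
    R = 3958.8  # mean Earth radius in miles (assumed value)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

print(haversine_miles(20.1656, 94.9414, 41.1053, 122.8539))  # ~2184 miles
```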

How long does it take to fly from Magway to Anshan?

The estimated flight time from Magway Airport to Anshan Teng'ao Airport is 4 hours and 38 minutes.
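The page does not state how the estimate is derived. A common heuristic adds a fixed taxi/climb/descent overhead to cruise time at a constant speed; the sketch below uses assumed values (500 mph cruise, 30 minutes overhead), which land near, but not exactly on, the 4 h 38 min shown above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # Assumed parameters, not the site's actual constants: fixed overhead
    # for taxi/climb/descent plus cruise at a constant ground speed.
    total_min = overhead_min + distance_miles / cruise_mph * 60
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m} min"

print(estimate_flight_time(2183))  # "4 h 52 min" with these assumed values
```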

Flight carbon footprint between Magway Airport (MWQ) and Anshan Teng'ao Airport (AOG)

On average, flying from Magway to Anshan generates about 238 kg of CO2 per passenger, or roughly 525 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
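The emission factor behind the figure is not published; backing one out of the numbers above gives roughly 0.109 kg of CO2 per passenger-mile (238 kg / 2183 miles). A sketch using that implied factor:

```python
def co2_kg_per_passenger(distance_miles, kg_per_pax_mile=0.109):
    # Factor implied by the figures above (238 kg / 2183 miles);
    # real per-passenger emissions vary by aircraft type and load factor.
    return distance_miles * kg_per_pax_mile

kg = co2_kg_per_passenger(2183)
print(f"{kg:.0f} kg ≈ {kg * 2.20462:.0f} lbs")  # ~238 kg ≈ 525 lbs
```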

Map of flight path and driving directions from Magway to Anshan

See the map of the shortest flight path between Magway Airport (MWQ) and Anshan Teng'ao Airport (AOG).

Airport information

Origin: Magway Airport
City: Magway
Country: Burma
IATA Code: MWQ
ICAO Code: VYMW
Coordinates: 20°9′56″N, 94°56′29″E
Destination: Anshan Teng'ao Airport
City: Anshan
Country: China
IATA Code: AOG
ICAO Code: ZYAS
Coordinates: 41°6′19″N, 122°51′14″E
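The distance formulas above take decimal degrees, while the coordinates here are listed in degrees/minutes/seconds. A small converter (function name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Magway Airport (MWQ): 20°9′56″N, 94°56′29″E
print(dms_to_decimal(20, 9, 56, "N"), dms_to_decimal(94, 56, 29, "E"))   # 20.1656 94.9414
# Anshan Teng'ao Airport (AOG): 41°6′19″N, 122°51′14″E
print(dms_to_decimal(41, 6, 19, "N"), dms_to_decimal(122, 51, 14, "E"))  # 41.1053 122.8539
```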