How far is Magway from Ningbo?

The distance between Ningbo (Ningbo Lishe International Airport) and Magway (Magway Airport) is 1786 miles / 2874 kilometers / 1552 nautical miles.

The driving distance from Ningbo (NGB) to Magway (MWQ) is 2285 miles / 3677 kilometers, and travel time by car is about 42 hours 38 minutes.

Ningbo Lishe International Airport – Magway Airport

Distance: 1786 miles / 2874 kilometers / 1552 nautical miles
Flight time: 3 h 52 min
Time difference: 1 h 30 min
CO2 emission: 199 kg

Distance from Ningbo to Magway

There are several ways to calculate the distance from Ningbo to Magway. Here are two standard methods:

Vincenty's formula (applied above)
  • 1786.058 miles
  • 2874.382 kilometers
  • 1552.042 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
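As a rough illustration, the sketch below implements the standard iterative Vincenty inverse solution on the WGS-84 ellipsoid in Python. It is not the calculator's own code: the function name, the convergence tolerance, and the decimal coordinates (converted from the DMS values listed under "Airport information" below) are assumptions, so the output should land close to, but not necessarily exactly on, the figures quoted above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                      # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563              # WGS-84 flattening
    b = (1 - f) * a                    # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):               # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                        if cos_sq_alpha else 0.0)
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Decimal-degree equivalents of the coordinates listed under "Airport information"
ngb = (29.826667, 121.461944)   # Ningbo Lishe International Airport (NGB)
mwq = (20.165556, 94.941389)    # Magway Airport (MWQ)

metres = vincenty_distance(*ngb, *mwq)
print(f"{metres / 1000:.3f} km, {metres / 1609.344:.3f} mi, {metres / 1852:.3f} NM")
```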

Haversine formula
  • 1784.360 miles
  • 2871.649 kilometers
  • 1550.566 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
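For comparison, here is a minimal haversine sketch in Python. It assumes a mean Earth radius of 6371 km (the exact radius the calculator uses is not stated), so the output should be within a few kilometers of the figures above; the function name and coordinates are the same illustrative assumptions as in the previous sketch.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# NGB and MWQ in decimal degrees (converted from the DMS values listed below)
km = haversine_km(29.826667, 121.461944, 20.165556, 94.941389)
print(f"{km:.1f} km  /  {km / 1.609344:.1f} mi  /  {km / 1.852:.1f} NM")
```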

How long does it take to fly from Ningbo to Magway?

The estimated flight time from Ningbo Lishe International Airport to Magway Airport is 3 hours and 52 minutes.
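The quoted duration implies an average block speed of roughly 1786 mi divided by 3.87 h, or about 460 mph. The snippet below backs that figure out of the published numbers and shows how a simple distance-over-assumed-speed estimate can be formatted the same way; the calculator's actual model (cruise speed, taxi and climb allowances) is not published, so the helper function and its default speed are purely illustrative.

```python
# Implied average speed from the figures on this page (1786 mi in 3 h 52 min)
distance_mi = 1786
quoted_hours = 3 + 52 / 60
print(f"Implied average speed: {distance_mi / quoted_hours:.0f} mph")

def estimate_flight_time(distance_mi: float, avg_speed_mph: float = 460) -> str:
    """Very rough flight-time estimate: distance divided by an assumed average speed."""
    total_minutes = round(distance_mi / avg_speed_mph * 60)
    return f"{total_minutes // 60} h {total_minutes % 60} min"

print(estimate_flight_time(1786))   # about 3 h 53 min with the assumed 460 mph
```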

Flight carbon footprint between Ningbo Lishe International Airport (NGB) and Magway Airport (MWQ)

On average, flying from Ningbo to Magway generates about 199 kg (439 lbs) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
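The two figures are linked by simple unit arithmetic: 199 kg multiplied by 2.20462 is about 439 lbs, and 199 kg spread over 1786 miles works out to roughly 0.11 kg of CO2 per passenger-mile. A quick check in Python:

```python
co2_kg = 199          # per-passenger estimate quoted above
distance_mi = 1786

print(f"{co2_kg * 2.20462:.0f} lb")                             # about 439 lb
print(f"{co2_kg / distance_mi:.3f} kg CO2 per passenger-mile")  # about 0.111
```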

Map of flight path and driving directions from Ningbo to Magway

See the map of the shortest flight path between Ningbo Lishe International Airport (NGB) and Magway Airport (MWQ).

Airport information

Origin Ningbo Lishe International Airport
City: Ningbo
Country: China
IATA Code: NGB
ICAO Code: ZSNB
Coordinates: 29°49′36″N, 121°27′43″E
Destination Magway Airport
City: Magway
Country: Burma
IATA Code: MWQ
ICAO Code: VYMW
Coordinates: 20°9′56″N, 94°56′29″E
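The coordinates above are given in degrees, minutes, and seconds. To use them with the distance formulas earlier on this page they need to be converted to decimal degrees; a small parser along these lines would do it (the function name and the assumption that every value follows the exact format shown above are illustrative):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a string like 29°49′36″N to decimal degrees."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    if m is None:
        raise ValueError(f"unrecognised coordinate: {dms!r}")
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("29°49′36″N"), dms_to_decimal("121°27′43″E"))  # NGB: 29.8267, 121.4619
print(dms_to_decimal("20°9′56″N"),  dms_to_decimal("94°56′29″E"))   # MWQ: 20.1656, 94.9414
```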