
How far is Shanghai from Myeik?

The distance between Myeik (Myeik Airport) and Shanghai (Shanghai Hongqiao International Airport) is 1940 miles / 3122 kilometers / 1686 nautical miles.

The driving distance from Myeik (MGZ) to Shanghai (SHA) is 2543 miles / 4092 kilometers, and travel time by car is about 49 hours 6 minutes.

Myeik Airport – Shanghai Hongqiao International Airport

Distance: 1940 miles / 3122 kilometers / 1686 nautical miles
Flight time: 4 h 10 min
Time difference: 1 h 30 min
CO2 emission: 212 kg


Distance from Myeik to Shanghai

There are several ways to calculate the distance from Myeik to Shanghai. Here are two standard methods:

Vincenty's formula (applied above)
  • 1939.905 miles
  • 3121.975 kilometers
  • 1685.732 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
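
As a minimal sketch of this ellipsoidal approach, the snippet below implements the standard Vincenty inverse formula on the WGS-84 ellipsoid in Python. The airport coordinates come from the airport information section; the convergence tolerance and iteration cap are illustrative assumptions, and this is not necessarily the exact implementation used for the figures above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Ellipsoidal (WGS-84) distance in statute miles via Vincenty's inverse formula."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                          # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sigma_m ** 2) -
        B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344                      # metres to statute miles

# Myeik (MGZ) to Shanghai Hongqiao (SHA), decimal degrees
print(round(vincenty_miles(12.439722, 98.621389, 31.197778, 121.335833), 1))
# should land close to the 1939.905-mile figure listed above
```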

Haversine formula
  • 1941.759 miles
  • 3124.958 kilometers
  • 1687.342 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
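
A compact Python sketch of the haversine calculation is shown below; the mean Earth radius of 3958.8 miles is an assumed value, so the result differs slightly from the figure above depending on the radius chosen.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere with mean radius 3958.8 statute miles."""
    r = 3958.8  # assumed mean Earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Myeik (MGZ) to Shanghai Hongqiao (SHA), decimal degrees from the airport section
print(round(haversine_miles(12.439722, 98.621389, 31.197778, 121.335833), 1))
# about 1942 miles, in line with the 1941.759-mile haversine figure above
```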

How long does it take to fly from Myeik to Shanghai?

The estimated flight time from Myeik Airport to Shanghai Hongqiao International Airport is 4 hours and 10 minutes.
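
The page does not state its flight-time model. A common rule of thumb is distance at an assumed cruise speed plus a fixed allowance for taxi, climb, and descent; the speed and allowance below are illustrative assumptions, so the result only roughly approximates the 4 h 10 min shown.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # Illustrative assumption: block time = distance at cruise speed plus a fixed overhead.
    hours = distance_miles / cruise_mph + overhead_min / 60
    h, m = divmod(round(hours * 60), 60)
    return f"{h} h {m} min"

print(estimated_flight_time(1940))  # about 4 h 23 min with these assumed parameters
```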

Flight carbon footprint between Myeik Airport (MGZ) and Shanghai Hongqiao International Airport (SHA)

On average, flying from Myeik to Shanghai generates about 212 kg of CO2 per passenger; 212 kilograms is equivalent to 467 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
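
The kilograms-to-pounds conversion in the paragraph above can be checked directly, as in the sketch below; the per-mile rate shown is simply the quoted total divided by the Vincenty distance, not the site's actual emission model.

```python
KG_TO_LB = 2.20462            # pounds per kilogram

co2_kg = 212
distance_miles = 1939.905

print(round(co2_kg * KG_TO_LB))           # 467 lbs, matching the text
print(round(co2_kg / distance_miles, 3))  # about 0.109 kg of CO2 per mile flown
```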

Map of flight path and driving directions from Myeik to Shanghai

See the map of the shortest flight path between Myeik Airport (MGZ) and Shanghai Hongqiao International Airport (SHA).

Airport information

Origin Myeik Airport
City: Myeik
Country: Burma
IATA Code: MGZ
ICAO Code: VYME
Coordinates: 12°26′23″N, 98°37′17″E
Destination Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E
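
The distance formulas above take decimal degrees, while the coordinates here are listed in degrees, minutes, and seconds. A small conversion helper (an illustrative sketch, not part of the site's code) is shown below.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Myeik Airport (MGZ): 12°26′23″N, 98°37′17″E
print(dms_to_decimal(12, 26, 23, "N"), dms_to_decimal(98, 37, 17, "E"))
# Shanghai Hongqiao (SHA): 31°11′52″N, 121°20′9″E
print(dms_to_decimal(31, 11, 52, "N"), dms_to_decimal(121, 20, 9, "E"))
```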