
How far is Huaihua from Myeik?

The distance between Myeik (Myeik Airport) and Huaihua (Huaihua Zhijiang Airport) is 1257 miles / 2023 kilometers / 1092 nautical miles.

The driving distance from Myeik (MGZ) to Huaihua (HJJ) is 1807 miles / 2908 kilometers, and travel time by car is about 35 hours 11 minutes.

Myeik Airport – Huaihua Zhijiang Airport

Distance: 1257 miles / 2023 kilometers / 1092 nautical miles
Flight time: 2 h 52 min
Time difference: 1 h 30 min
CO2 emission: 164 kg


Distance from Myeik to Huaihua

There are several ways to calculate the distance from Myeik to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 1256.954 miles
  • 2022.872 kilometers
  • 1092.263 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
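For readers who want to reproduce the figure, here is a minimal Python sketch of the standard inverse Vincenty method on the WGS-84 ellipsoid (the calculator's exact implementation isn't published). The decimal coordinates are converted from the DMS values in the airport information section below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: geodesic distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate lambda to convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0                        # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (sigma + C * sinSigma * (
            cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# MGZ 12°26′23″N 98°37′17″E and HJJ 27°26′27″N 109°42′0″E in decimal degrees:
m = vincenty_distance(12.4397, 98.6214, 27.4408, 109.7000)
print(f"{m / 1609.344:.1f} mi / {m / 1000:.1f} km")   # ≈ 1257.0 mi / 2022.9 km
```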

Haversine formula
  • 1260.062 miles
  • 2027.873 kilometers
  • 1094.964 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
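A matching sketch for the haversine formula, assuming the commonly used IUGG mean earth radius of 6,371.0088 km (the page does not state which radius it uses):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, r=6371008.8):
    """Great-circle distance in metres on a sphere of mean radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

m = haversine_distance(12.4397, 98.6214, 27.4408, 109.7000)
print(f"{m / 1000:.1f} km")   # ≈ 2027.9 km, slightly longer than Vincenty's result
```

The small gap between the two results (about 5 km here) comes from the haversine formula's spherical approximation versus Vincenty's ellipsoidal model.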

How long does it take to fly from Myeik to Huaihua?

The estimated flight time from Myeik Airport to Huaihua Zhijiang Airport is 2 hours and 52 minutes.
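The page does not publish its flight-time formula. The snippet below, purely as an assumption, backs out the average speed implied by the quoted figures and compares it with a common rule of thumb (roughly 30 minutes of fixed overhead plus cruise at about 500 mph):

```python
distance_mi = 1257           # Vincenty distance from above
flight_min = 2 * 60 + 52     # the quoted 2 h 52 min estimate

# Average block speed implied by the quoted figures (includes climb/descent):
print(f"{distance_mi / (flight_min / 60):.0f} mph")   # ≈ 438 mph

# Rule-of-thumb check (assumption, not the site's stated formula):
est_min = 30 + distance_mi / 500 * 60
print(f"{int(est_min // 60)} h {est_min % 60:.0f} min")   # 3 h 1 min
```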

Flight carbon footprint between Myeik Airport (MGZ) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Myeik to Huaihua generates about 164 kg of CO2 per passenger, which is equivalent to 362 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
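The unit conversion can be checked directly; the per-mile rate below is simply derived from the two quoted figures and is not the site's stated methodology:

```python
co2_kg = 164                 # quoted per-passenger estimate
distance_mi = 1257

print(f"{co2_kg * 2.20462:.0f} lbs")          # 362 lbs, matching the figure above
print(f"{co2_kg / distance_mi:.3f} kg/mi")    # ≈ 0.130 kg CO2 per passenger-mile
```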

Map of flight path and driving directions from Myeik to Huaihua

See the map of the shortest flight path between Myeik Airport (MGZ) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin: Myeik Airport
City: Myeik
Country: Burma
IATA Code: MGZ
ICAO Code: VYME
Coordinates: 12°26′23″N, 98°37′17″E
Destination: Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E
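Both distance formulas above expect decimal degrees, so a small helper for converting the DMS coordinates listed here may be useful; `dms_to_decimal` is an illustrative name, not part of any published API:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Degrees/minutes/seconds plus hemisphere letter -> signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(12, 26, 23, "N"))   # 12.4397...  (MGZ latitude)
print(dms_to_decimal(109, 42, 0, "E"))   # 109.7       (HJJ longitude)
```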