
How far is Huaihua from Mexico City?

The distance between Mexico City (Mexico City International Airport) and Huaihua (Huaihua Zhijiang Airport) is 8680 miles / 13969 kilometers / 7542 nautical miles.

Mexico City International Airport – Huaihua Zhijiang Airport

  • Distance: 8680 miles / 13969 kilometers / 7542 nautical miles
  • Flight time: 16 h 56 min
  • CO2 emission: 1 099 kg


Distance from Mexico City to Huaihua

There are several ways to calculate the distance from Mexico City to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 8679.748 miles
  • 13968.700 kilometers
  • 7542.495 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
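For the curious, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is illustrative rather than production-grade; the airport coordinates are the decimal-degree equivalents of the DMS values listed under Airport information below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in kilometers between two points on the WGS-84 ellipsoid."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                        # iterate until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero for equatorial lines (cos2_alpha == 0)
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000  # meters -> kilometers

# MEX (19°26′10″N, 99°4′19″W) and HJJ (27°26′27″N, 109°42′0″E) in decimal degrees
print(vincenty_distance(19.4361, -99.0719, 27.4408, 109.7000))  # ~13969 km
```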

Haversine formula
  • 8668.963 miles
  • 13951.343 kilometers
  • 7533.123 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of a sphere).
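The haversine version is much shorter. A sketch using a mean Earth radius of 6 371 km (implementations vary slightly in the radius they pick, so the final digits may differ from the figures above):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.atan2(math.sqrt(a), math.sqrt(1 - a))

print(haversine_distance(19.4361, -99.0719, 27.4408, 109.7000))  # ~13951 km
```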

How long does it take to fly from Mexico City to Huaihua?

The estimated flight time from Mexico City International Airport to Huaihua Zhijiang Airport is 16 hours and 56 minutes.
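The site does not publish its timing model, but the figure is consistent with a common rule of thumb: distance divided by an assumed average speed of about 850 km/h, plus a 30-minute allowance for takeoff and landing. This reconstruction is an assumption, not the site's documented method:

```python
distance_km = 13969          # great-circle distance from above
cruise_kmh = 850             # assumed average block speed
overhead_h = 0.5             # assumed allowance for takeoff and landing

hours = distance_km / cruise_kmh + overhead_h
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # -> 16 h 56 min
```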

Flight carbon footprint between Mexico City International Airport (MEX) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Mexico City to Huaihua generates about 1 099 kg of CO2 per passenger; 1 099 kilograms equals 2 423 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
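The kilogram-to-pound conversion can be checked directly; the per-kilometer rate below is a derived figure for context, not something the site states:

```python
co2_kg = 1099
co2_lbs = co2_kg * 2.20462       # kilograms to pounds
per_km = co2_kg / 13969          # implied per-passenger rate

print(round(co2_lbs))            # -> 2423 lbs
print(round(per_km * 1000))      # -> ~79 g of CO2 per passenger-kilometer
```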

Map of flight path from Mexico City to Huaihua

See the map of the shortest flight path between Mexico City International Airport (MEX) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin: Mexico City International Airport
City: Mexico City
Country: Mexico
IATA Code: MEX
ICAO Code: MMMX
Coordinates: 19°26′10″N, 99°4′19″W
Destination: Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E
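The coordinates above are given in degrees/minutes/seconds, while the distance sketches earlier expect decimal degrees. A small helper (hypothetical, for illustration) handles the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(19, 26, 10, "N"))   # MEX latitude  ->  19.4361...
print(dms_to_decimal(99, 4, 19, "W"))    # MEX longitude -> -99.0719...
print(dms_to_decimal(27, 26, 27, "N"))   # HJJ latitude  ->  27.4408...
print(dms_to_decimal(109, 42, 0, "E"))   # HJJ longitude -> 109.7
```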