
How far is Hechi from Mexico City?

The distance between Mexico City (Mexico City International Airport) and Hechi (Hechi Jinchengjiang Airport) is 8900 miles / 14323 kilometers / 7734 nautical miles.

Mexico City International Airport – Hechi Jinchengjiang Airport

Distance: 8900 miles / 14323 kilometers / 7734 nautical miles
Flight time: 17 h 21 min
CO2 emission: 1 132 kg

Distance from Mexico City to Hechi

There are several ways to calculate the distance from Mexico City to Hechi. Here are two standard methods:

Vincenty's formula (applied above)
  • 8899.660 miles
  • 14322.614 kilometers
  • 7733.593 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
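As an illustration, the sketch below implements Vincenty's inverse formula in Python on the WGS-84 ellipsoid, using the airport coordinates listed further down this page converted to decimal degrees. The iteration limit and convergence tolerance are assumptions for this example, and the printed figure may differ slightly from the value above depending on the ellipsoid constants and rounding used.

import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Surface distance in statute miles via Vincenty's inverse formula
    on the WGS-84 ellipsoid (illustrative sketch, not production code)."""
    a = 6378137.0                  # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); equatorial geodesics have cos_sq_alpha == 0
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                       if cos_sq_alpha else 0.0)
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos2sigma_m ** 2)
            - B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344       # metres -> statute miles

# MEX (19°26′10″N, 99°4′19″W) to HCJ (24°48′18″N, 107°41′58″E)
print(round(vincenty_miles(19.43611, -99.07194, 24.80500, 107.69944), 3))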

Haversine formula
  • 8889.318 miles
  • 14305.971 kilometers
  • 7724.606 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
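For comparison, here is a minimal haversine sketch in the same style. The mean Earth radius of 6371 km is an assumption; choosing a slightly different radius shifts the result by a few kilometers.

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# MEX (19°26′10″N, 99°4′19″W) to HCJ (24°48′18″N, 107°41′58″E)
km = haversine_km(19.43611, -99.07194, 24.80500, 107.69944)
print(round(km, 3), "km /", round(km / 1.609344, 3), "miles")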

How long does it take to fly from Mexico City to Hechi?

The estimated flight time from Mexico City International Airport to Hechi Jinchengjiang Airport is 17 hours and 21 minutes.
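As a rough cross-check using only the figures quoted on this page, the implied block-average speed works out to about 513 mph; the calculator's own assumptions (taxi time, climb, cruise speed) are not stated here.

# Implied average speed from the distance and duration quoted above
distance_miles = 8899.660
duration_hours = 17 + 21 / 60            # 17 h 21 min
print(round(distance_miles / duration_hours, 1), "mph")   # ~513 mph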

Flight carbon footprint between Mexico City International Airport (MEX) and Hechi Jinchengjiang Airport (HCJ)

On average, flying from Mexico City to Hechi generates about 1 132 kg of CO2 per passenger, which is equal to 2 496 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
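The pound figure follows from the standard conversion factor of roughly 2.20462 lb per kilogram:

co2_kg = 1132
print(round(co2_kg * 2.20462), "lb")     # ≈ 2496 lb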

Map of flight path from Mexico City to Hechi

See the map of the shortest flight path between Mexico City International Airport (MEX) and Hechi Jinchengjiang Airport (HCJ).

Airport information

Origin: Mexico City International Airport
City: Mexico City
Country: Mexico
IATA Code: MEX
ICAO Code: MMMX
Coordinates: 19°26′10″N, 99°4′19″W
Destination: Hechi Jinchengjiang Airport
City: Hechi
Country: China
IATA Code: HCJ
ICAO Code: ZGHC
Coordinates: 24°48′18″N, 107°41′58″E