How far is Huaihua from Hkamti?

The distance between Hkamti (Khamti Airport) and Huaihua (Huaihua Zhijiang Airport) is 872 miles / 1404 kilometers / 758 nautical miles.

The driving distance from Hkamti (KHM) to Huaihua (HJJ) is 1250 miles / 2012 kilometers, and travel time by car is about 25 hours 34 minutes.

Khamti Airport – Huaihua Zhijiang Airport

Distance: 872 miles / 1404 kilometers / 758 nautical miles
Flight time: 2 h 9 min
Time difference: 1 h 30 min
CO2 emission: 141 kg

Distance from Hkamti to Huaihua

There are several ways to calculate the distance from Hkamti to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 872.461 miles
  • 1404.089 kilometers
  • 758.148 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
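
For reference, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is an illustrative implementation, not the calculator's own code; run with the airport coordinates listed under "Airport information" below, it should give a result close to the 1404 km figure above.

    import math

    def vincenty_distance_km(lat1, lon1, lat2, lon2):
        """Vincenty's inverse formula on the WGS-84 ellipsoid (distance in kilometers)."""
        a = 6378137.0           # semi-major axis in meters
        f = 1 / 298.257223563   # flattening
        b = (1 - f) * a         # semi-minor axis

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sin_U1, cos_U1 = math.sin(U1), math.cos(U1)
        sin_U2, cos_U2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):  # iterate until the auxiliary longitude converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cos_U2 * sin_lam,
                                   cos_U1 * sin_U2 - sin_U1 * cos_U2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sin_U1 * sin_U2 + cos_U1 * cos_U2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cos_U1 * cos_U2 * sin_lam / sin_sigma
            cos_sq_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = cos_sigma - 2 * sin_U1 * sin_U2 / cos_sq_alpha
            C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                         (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
        B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma) / 1000  # meters -> kilometers

    # Khamti (KHM) and Huaihua Zhijiang (HJJ) coordinates in decimal degrees
    print(round(vincenty_distance_km(25.988056, 95.674167, 27.440833, 109.70), 1))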

Haversine formula
  • 870.958 miles
  • 1401.671 kilometers
  • 756.842 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
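
A haversine sketch is much shorter, since it only needs a single trigonometric identity on a sphere. The mean Earth radius of 6371 km used here is an assumed constant, so the result can differ slightly from the figure above depending on the radius chosen.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere (default radius: mean Earth radius)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        d_phi = math.radians(lat2 - lat1)
        d_lam = math.radians(lon2 - lon1)
        h = (math.sin(d_phi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    # Khamti (KHM) and Huaihua Zhijiang (HJJ) coordinates in decimal degrees
    print(round(haversine_km(25.988056, 95.674167, 27.440833, 109.70), 1))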

How long does it take to fly from Hkamti to Huaihua?

The estimated flight time from Khamti Airport to Huaihua Zhijiang Airport is 2 hours and 9 minutes.
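
The calculator does not publish the exact parameters behind this estimate. A common rule of thumb, shown below purely as an assumption, adds a fixed allowance for taxi, climb, and descent to the cruise time at a typical airliner speed; with the values assumed here it lands in the same ballpark as the 2 hours 9 minutes above.

    def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Cruise time at an assumed average speed plus a fixed taxi/climb/descent allowance."""
        total_min = round(overhead_min + distance_miles / cruise_mph * 60)
        return divmod(total_min, 60)

    hours, minutes = estimated_flight_time(872)
    print(f"{hours} h {minutes} min")  # about 2 h 15 min with these assumed parameters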

Flight carbon footprint between Khamti Airport (KHM) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Hkamti to Huaihua generates about 141 kg of CO2 per passenger; 141 kilograms is equal to 311 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
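
The kilograms-to-pounds conversion can be checked directly (the emission estimate itself comes from the calculator):

    co2_kg = 141
    print(f"{co2_kg} kg = {co2_kg * 2.20462:.0f} lbs")  # 141 kg ≈ 311 lbs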

Map of flight path and driving directions from Hkamti to Huaihua

See the map of the shortest flight path between Khamti Airport (KHM) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin: Khamti Airport
City: Hkamti
Country: Burma
IATA Code: KHM
ICAO Code: VYKI
Coordinates: 25°59′17″N, 95°40′27″E
Destination: Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E