
How far is Huaihua from Jammu?

The distance between Jammu (Jammu Airport) and Huaihua (Huaihua Zhijiang Airport) is 2110 miles / 3396 kilometers / 1834 nautical miles.

The driving distance from Jammu (IXJ) to Huaihua (HJJ) is 3122 miles / 5025 kilometers, and travel time by car is about 61 hours 58 minutes.

Jammu Airport – Huaihua Zhijiang Airport

Distance: 2110 miles / 3396 kilometers / 1834 nautical miles
Flight time: 4 h 29 min
Time difference: 2 h 30 min
CO2 emission: 230 kg


Distance from Jammu to Huaihua

There are several ways to calculate the distance from Jammu to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 2110.184 miles
  • 3396.012 kilometers
  • 1833.700 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
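For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function and constant names are our own for illustration, and the iteration is known not to converge for nearly antipodal points.

    import math

    # WGS-84 ellipsoid parameters
    A = 6378137.0          # semi-major axis, meters
    F = 1 / 298.257223563  # flattening
    B = (1 - F) * A        # semi-minor axis, meters

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        # Distance in meters between two lat/lon points (decimal degrees).
        # Note: coincident points and near-antipodal pairs need special handling.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - F) * math.tan(phi1))  # reduced latitudes
        U2 = math.atan((1 - F) * math.tan(phi2))
        sin_u1, cos_u1 = math.sin(U1), math.cos(U1)
        sin_u2, cos_u2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cos_u2 * sin_lam,
                                   cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
            cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sin_u1 * sin_u2 / cos2_alpha
            C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * F * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
        big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
        return B * big_a * (sigma - d_sigma)  # meters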

Haversine formula
  • 2106.368 miles
  • 3389.871 kilometers
  • 1830.384 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
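Here is a matching Python sketch of the haversine formula; the 6371 km mean Earth radius is the spherical-model assumption.

    import math

    EARTH_RADIUS_KM = 6371.0  # mean Earth radius; the spherical-model assumption

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance in kilometers between two lat/lon points
        # given in decimal degrees.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        d_phi = math.radians(lat2 - lat1)
        d_lam = math.radians(lon2 - lon1)
        a = (math.sin(d_phi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))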

How long does it take to fly from Jammu to Huaihua?

The estimated flight time from Jammu Airport to Huaihua Zhijiang Airport is 4 hours and 29 minutes.
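The calculator does not publish its formula. A common back-of-envelope estimate adds a fixed taxi/climb allowance to the time spent at cruise speed; the 500 mph cruise speed and 30-minute overhead below are illustrative assumptions, which is why this sketch lands near, but not exactly on, the 4 hours 29 minutes shown above.

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        # Rough estimate: fixed taxi/climb allowance plus time at cruise speed.
        # Both constants are illustrative assumptions, not the calculator's own.
        total_min = overhead_min + distance_miles / cruise_mph * 60
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} h {minutes} min"

    print(estimate_flight_time(2110))  # -> "4 h 43 min" with these assumed constants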

Flight carbon footprint between Jammu Airport (IXJ) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Jammu to Huaihua generates about 230 kg of CO2 per passenger, which is roughly 507 pounds (lbs). These figures are estimates and cover only the CO2 generated by burning jet fuel.
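The kilogram-to-pound conversion is exact (1 lb = 0.45359237 kg by definition); the per-mile factor printed below is simply what the page's own figures imply, not a published constant.

    KG_PER_LB = 0.45359237  # exact definition of the pound

    co2_kg = 230  # per-passenger estimate quoted above
    print(f"{co2_kg} kg = {co2_kg / KG_PER_LB:.0f} lbs")      # -> "230 kg = 507 lbs"
    print(f"{co2_kg / 2110:.3f} kg CO2 per passenger-mile")   # -> "0.109", implied above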

Map of flight path and driving directions from Jammu to Huaihua

See the map of the shortest flight path between Jammu Airport (IXJ) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin: Jammu Airport
City: Jammu
Country: India
IATA Code: IXJ
ICAO Code: VIJU
Coordinates: 32°41′20″N, 74°50′14″E
Destination: Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E
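
Putting it together with the coordinates listed above (assuming the haversine_km and vincenty_inverse sketches from earlier are in scope; dms_to_deg is our own helper):

    def dms_to_deg(d, m, s, hemisphere):
        # Degrees/minutes/seconds plus hemisphere letter -> signed decimal degrees.
        sign = -1 if hemisphere in "SW" else 1
        return sign * (d + m / 60 + s / 3600)

    ixj = (dms_to_deg(32, 41, 20, "N"), dms_to_deg(74, 50, 14, "E"))  # Jammu
    hjj = (dms_to_deg(27, 26, 27, "N"), dms_to_deg(109, 42, 0, "E"))  # Huaihua Zhijiang

    print(haversine_km(*ixj, *hjj))             # ~3390 km (spherical model)
    print(vincenty_inverse(*ixj, *hjj) / 1000)  # ~3396 km (WGS-84 ellipsoid)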