
How far is Hechi from Jammu?

The distance between Jammu (Jammu Airport) and Hechi (Hechi Jinchengjiang Airport) is 2058 miles / 3312 kilometers / 1788 nautical miles.

The driving distance from Jammu (IXJ) to Hechi (HCJ) is 3015 miles / 4852 kilometers, and travel time by car is about 59 hours 57 minutes.

Jammu Airport – Hechi Jinchengjiang Airport

Distance: 2058 miles / 3312 kilometers / 1788 nautical miles
Flight time: 4 h 23 min
Time difference: 2 h 30 min
CO2 emission: 224 kg per passenger

Distance from Jammu to Hechi

There are several ways to calculate the distance from Jammu to Hechi. Here are two standard methods:

Vincenty's formula (applied above)
  • 2057.909 miles
  • 3311.884 kilometers
  • 1788.274 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
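The calculator does not publish its implementation, but Vincenty's inverse method is well documented. Below is a minimal Python sketch of the standard iterative algorithm on the WGS-84 ellipsoid; the function name, iteration cap, and convergence tolerance are my own choices, and nearly antipodal points may fail to converge.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in meters between two points on the WGS-84 ellipsoid,
    computed with Vincenty's inverse method (iterative)."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha != 0 else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Verify against the airport coordinates listed at the bottom of this page.
jammu = (32 + 41/60 + 20/3600, 74 + 50/60 + 14/3600)   # 32°41′20″N, 74°50′14″E
hechi = (24 + 48/60 + 18/3600, 107 + 41/60 + 58/3600)  # 24°48′18″N, 107°41′58″E
print(f"{vincenty_distance(*jammu, *hechi) / 1000:.3f} km")  # ≈ 3311.9 km
```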

Haversine formula
  • 2054.759 miles
  • 3306.814 kilometers
  • 1785.537 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
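The haversine computation is much simpler. A self-contained Python sketch is shown below; the mean Earth radius of 6371 km is a conventional choice, and a slightly different assumed radius shifts the result by a few kilometers.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometers on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

print(f"{haversine_distance(32.68889, 74.83722, 24.80500, 107.69944):.1f} km")
# ≈ 3306.9 km, in line with the figure above
```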

How long does it take to fly from Jammu to Hechi?

The estimated flight time from Jammu Airport to Hechi Jinchengjiang Airport is 4 hours and 23 minutes.
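The page does not state how this figure is derived. A common back-of-the-envelope model adds a fixed taxi/climb/descent allowance to cruise time at an assumed average speed; with the illustrative parameters below it yields roughly 4 h 37 min, close to but not exactly the quoted value.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus a fixed allowance for
    taxi, climb, and descent. Both parameters are illustrative assumptions,
    not the calculator's own model."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m} min"

print(estimate_flight_time(2058))  # ≈ 4 h 37 min with these assumptions
```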

Flight carbon footprint between Jammu Airport (IXJ) and Hechi Jinchengjiang Airport (HCJ)

On average, flying from Jammu to Hechi generates about 224 kg (494 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
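As a rough illustration, the quoted figure works out to about 0.109 kg of CO2 per passenger-mile (224 kg / 2058 mi). The sketch below simply back-solves that factor; real emissions models vary with aircraft type, load factor, and route length, so treat the constant as an assumption.

```python
def estimate_co2_kg(distance_miles, kg_per_passenger_mile=0.109):
    """Illustrative per-passenger CO2 estimate. The per-mile factor is
    back-solved from this page's figures (224 kg over 2058 miles)."""
    return distance_miles * kg_per_passenger_mile

kg = estimate_co2_kg(2058)
print(f"{kg:.0f} kg = {kg * 2.20462:.0f} lbs")  # ≈ 224 kg ≈ 494 lbs
```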

Map of flight path and driving directions from Jammu to Hechi

See the map of the shortest flight path between Jammu Airport (IXJ) and Hechi Jinchengjiang Airport (HCJ).

Airport information

Origin: Jammu Airport
City: Jammu
Country: India
IATA Code: IXJ
ICAO Code: VIJU
Coordinates: 32°41′20″N, 74°50′14″E

Destination: Hechi Jinchengjiang Airport
City: Hechi
Country: China
IATA Code: HCJ
ICAO Code: ZGHC
Coordinates: 24°48′18″N, 107°41′58″E