
How far is Ji'an from Jabalpur?

The distance between Jabalpur (Jabalpur Airport) and Ji'an (Jinggangshan Airport) is 2183 miles / 3514 kilometers / 1897 nautical miles.

The driving distance from Jabalpur (JLR) to Ji'an (JGS) is 2975 miles / 4788 kilometers, and the travel time by car is about 59 hours.

Jabalpur Airport – Jinggangshan Airport

  • Distance: 2183 miles / 3514 kilometers / 1897 nautical miles
  • Flight time: 4 h 38 min
  • Time difference: 2 h 30 min
  • CO2 emission: 238 kg


Distance from Jabalpur to Ji'an

There are several ways to calculate the distance from Jabalpur to Ji'an. Here are two standard methods:

Vincenty's formula (applied above)
  • 2183.388 miles
  • 3513.823 kilometers
  • 1897.312 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
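For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed at the bottom of this page. The function name, iteration limit and tolerance are illustrative choices, not the calculator's actual code.

import math

def vincenty_distance_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    # WGS-84 ellipsoid constants
    a = 6378137.0                  # semi-major axis in metres
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis in metres

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
              (cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break                  # converged

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 *
                  (cos_sigma * (-1 + 2 * cos2sigma_m ** 2) -
                   B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) *
                   (-3 + 4 * cos2sigma_m ** 2)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344       # metres to statute miles

# JLR (23°10′40″N, 80°3′7″E) to JGS (26°51′24″N, 114°44′13″E)
print(vincenty_distance_miles(23.1778, 80.0519, 26.8567, 114.7369))  # close to 2183 miles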

Haversine formula
  • 2179.805 miles
  • 3508.056 kilometers
  • 1894.199 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
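A Python sketch of the haversine formula follows, again using the coordinates from the airport information section. The small gap versus the Vincenty figure comes from the spherical-Earth assumption, and the exact value also depends on which mean Earth radius is used (6371 km assumed here).

import math

def haversine_distance_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere with the given mean Earth radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344           # kilometres to statute miles

# Same airport coordinates as above
print(haversine_distance_miles(23.1778, 80.0519, 26.8567, 114.7369))  # close to 2180 miles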

How long does it take to fly from Jabalpur to Ji'an?

The estimated flight time from Jabalpur Airport to Jinggangshan Airport is 4 hours and 38 minutes.
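The page does not say how this estimate is derived. As a sanity check, the implied average block speed can be computed from the figures quoted above; the helper below, with its assumed 470 mph whole-flight average, is only an illustrative way to go back from distance to flight time, not the calculator's actual method.

distance_miles = 2183.388            # Vincenty distance quoted above
flight_time_hours = 4 + 38 / 60      # 4 h 38 min quoted above

print(distance_miles / flight_time_hours)    # implied average speed, roughly 471 mph

def estimate_flight_time(miles, avg_speed_mph=470):
    # avg_speed_mph is an assumed whole-flight average, not a published parameter
    hours = miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m} min"

print(estimate_flight_time(distance_miles))  # about 4 h 39 min with these assumptions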

Flight carbon footprint between Jabalpur Airport (JLR) and Jinggangshan Airport (JGS)

On average, flying from Jabalpur to Ji'an generates about 238 kg of CO2 per passenger, which is equivalent to 526 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
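The unit conversion and per-mile intensity can be checked directly from the numbers on this page; this is plain arithmetic, and the pound figure comes out slightly below 526 lbs only because the kilogram value is already rounded.

co2_kg = 238                           # per-passenger estimate quoted above
distance_miles = 2183.388

print(co2_kg * 2.20462)                # about 525 lbs (526 lbs above reflects an unrounded kg figure)
print(co2_kg / distance_miles * 1000)  # roughly 109 g of CO2 per passenger-mile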

Map of flight path and driving directions from Jabalpur to Ji'an

See the map of the shortest flight path between Jabalpur Airport (JLR) and Jinggangshan Airport (JGS).

Airport information

Origin: Jabalpur Airport
City: Jabalpur
Country: India
IATA Code: JLR
ICAO Code: VAJB
Coordinates: 23°10′40″N, 80°3′7″E
Destination: Jinggangshan Airport
City: Ji'an
Country: China
IATA Code: JGS
ICAO Code: ZSJA
Coordinates: 26°51′24″N, 114°44′13″E