
How far is Ji'an from Bhopal?

The distance between Bhopal (Raja Bhoj Airport) and Ji'an (Jinggangshan Airport) is 2349 miles / 3781 kilometers / 2041 nautical miles.

The driving distance from Bhopal (BHO) to Ji'an (JGS) is 3163 miles / 5090 kilometers, and travel time by car is about 62 hours 8 minutes.

Raja Bhoj Airport – Jinggangshan Airport

Distance: 2349 miles / 3781 kilometers / 2041 nautical miles
Flight time: 4 h 56 min
Time difference: 2 h 30 min
CO2 emission: 258 kg


Distance from Bhopal to Ji'an

There are several ways to calculate the distance from Bhopal to Ji'an. Here are two standard methods:

Vincenty's formula (applied above)
  • 2349.119 miles
  • 3780.541 kilometers
  • 2041.329 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
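For reference, the calculation can be reproduced with a short script. The sketch below implements Vincenty's inverse method on the WGS-84 ellipsoid; the calculator's exact ellipsoid parameters and rounding are not stated, so the output may differ slightly from the figures above. The airport coordinates are taken from the airport information section further down, converted to decimal degrees and truncated to four decimal places.

```python
import math

A_AXIS = 6378137.0              # WGS-84 semi-major axis (meters)
F = 1 / 298.257223563           # WGS-84 flattening
B_AXIS = (1 - F) * A_AXIS       # semi-minor axis (meters)

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in meters between two latitude/longitude points."""
    u1 = math.atan((1 - F) * math.tan(math.radians(lat1)))   # reduced latitudes
    u2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    ell = math.radians(lon2 - lon1)
    sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
    sin_u2, cos_u2 = math.sin(u2), math.cos(u2)

    lam = ell
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cos_u2 * sin_lam,
                               cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                        # coincident points
        cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sin_u1 * sin_u2 / cos2_alpha if cos2_alpha else 0.0
        c = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = ell + (1 - c) * F * sin_alpha * (
            sigma + c * sin_sigma * (cos_2sm + c * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    big_a = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    big_b = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return B_AXIS * big_a * (sigma - delta_sigma)

# Airport coordinates from the section below, in decimal degrees.
bho = (23.2875, 77.3372)       # Raja Bhoj Airport (BHO)
jgs = (26.8567, 114.7369)      # Jinggangshan Airport (JGS)

meters = vincenty_distance(*bho, *jgs)
print(f"{meters / 1609.344:.1f} mi / {meters / 1000:.1f} km / {meters / 1852:.1f} nm")
```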

Haversine formula
  • 2345.226 miles
  • 3774.276 kilometers
  • 2037.946 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
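The same coordinates can be run through the haversine formula. The sketch below assumes a mean Earth radius of 6371 km; a different radius would shift the result slightly, which is part of why the spherical figures differ from the ellipsoidal (Vincenty) figures above.

```python
import math

EARTH_RADIUS_KM = 6371.0        # mean Earth radius used by the spherical model

def haversine_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two latitude/longitude points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

km = haversine_distance(23.2875, 77.3372, 26.8567, 114.7369)   # BHO -> JGS
print(f"{km * 0.621371:.1f} mi / {km:.1f} km / {km * 0.539957:.1f} nm")
```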

How long does it take to fly from Bhopal to Ji'an?

The estimated flight time from Raja Bhoj Airport to Jinggangshan Airport is 4 hours and 56 minutes.
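The page does not state how this estimate is derived. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to the cruise time at an assumed average speed; the sketch below uses illustrative values (500 mph cruise, 30 minutes overhead) that are not the calculator's published parameters, so it lands a little above the 4 hours 56 minutes shown.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.

    cruise_mph and overhead_min are illustrative assumptions, not values
    published by the calculator.
    """
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return hours, minutes

print(estimate_flight_time(2349))   # -> (5, 12) with these assumptions
```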

Flight carbon footprint between Raja Bhoj Airport (BHO) and Jinggangshan Airport (JGS)

On average, flying from Bhopal to Ji'an generates about 258 kg of CO2 per passenger; 258 kilograms is equal to 568 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
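The unit conversion and the per-kilometer intensity implied by the page's own numbers are simple arithmetic, sketched below (2.20462 lbs per kg; the per-km figure is just 258 kg divided by the 3781 km flight distance, not an independent emission factor).

```python
co2_kg = 258
co2_lbs = co2_kg * 2.20462          # kilograms to pounds
per_km = co2_kg / 3781              # implied kg CO2 per passenger-km on this route
print(f"{co2_lbs:.1f} lbs, {per_km:.3f} kg CO2 per passenger-km")
# -> 568.8 lbs, 0.068 kg CO2 per passenger-km
```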

Map of flight path and driving directions from Bhopal to Ji'an

See the map of the shortest flight path between Raja Bhoj Airport (BHO) and Jinggangshan Airport (JGS).

Airport information

Origin Raja Bhoj Airport
City: Bhopal
Country: India
IATA Code: BHO
ICAO Code: VABP
Coordinates: 23°17′15″N, 77°20′14″E
Destination Jinggangshan Airport
City: Ji'an
Country: China
IATA Code: JGS
ICAO Code: ZSJA
Coordinates: 26°51′24″N, 114°44′13″E