How far is Huangping from Jabalpur?

The distance between Jabalpur (Jabalpur Airport) and Huangping (Kaili Airport) is 1767 miles / 2844 kilometers / 1535 nautical miles.

The driving distance from Jabalpur (JLR) to Huangping (KJH) is 2494 miles / 4013 kilometers, and travel time by car is about 50 hours 16 minutes.

Jabalpur Airport – Kaili Airport

Distance: 1767 miles / 2844 kilometers / 1535 nautical miles
Flight time: 3 h 50 min
Time difference: 2 h 30 min
CO2 emission: 198 kg

Distance from Jabalpur to Huangping

There are several ways to calculate the distance from Jabalpur to Huangping. Here are two standard methods:

Vincenty's formula (applied above)
  • 1766.988 miles
  • 2843.691 kilometers
  • 1535.471 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
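
For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the airport coordinates listed further down, converted to decimal degrees; the code is an illustrative transcription of the standard iteration, not a production implementation.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: geodesic distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Airport coordinates from the table below, converted to decimal degrees
metres = vincenty_distance(23.17778, 80.05194, 26.97194, 107.98778)
print(metres / 1609.344, "miles")       # ≈ 1767
print(metres / 1000, "kilometers")      # ≈ 2844
print(metres / 1852, "nautical miles")  # ≈ 1535
```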

Haversine formula
  • 1764.165 miles
  • 2839.148 kilometers
  • 1533.017 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance, i.e. the shortest path between the two points along the surface.
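
As a point of comparison, here is a minimal Python sketch of the haversine formula. The exact kilometre figure depends on the mean earth radius assumed (6371 km in this sketch).

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_distance(23.17778, 80.05194, 26.97194, 107.98778)
print(km, "kilometers")        # ≈ 2839 with a 6371 km mean radius
print(km / 1.609344, "miles")  # ≈ 1764
```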

How long does it take to fly from Jabalpur to Huangping?

The estimated flight time from Jabalpur Airport to Kaili Airport is 3 hours and 50 minutes.
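
The page does not state how this estimate is derived. A common back-of-the-envelope approach, shown below purely as an assumption and not as the calculator's actual method, is to divide the flight distance by an assumed average block speed (here 460 mph, chosen for illustration).

```python
def estimate_flight_time(distance_miles, avg_block_speed_mph=460):
    # Assumed average speed covering climb, cruise and descent; illustrative only.
    total_minutes = round(distance_miles / avg_block_speed_mph * 60)
    hours, minutes = divmod(total_minutes, 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1767))  # "3 h 50 min" with the assumed 460 mph
```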

Flight carbon footprint between Jabalpur Airport (JLR) and Kaili Airport (KJH)

On average, flying from Jabalpur to Huangping generates about 198 kg of CO2 per passenger; 198 kilograms is about 437 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
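
For context, a rough sketch of how such a per-passenger figure can be derived: fuel burned on the route multiplied by a standard combustion emission factor for jet fuel (about 3.16 kg of CO2 per kg of fuel), divided by the number of passengers. The fuel-burn and passenger numbers below are hypothetical, chosen only to illustrate the arithmetic, and are not data from the calculator.

```python
KG_CO2_PER_KG_JET_FUEL = 3.16   # standard emission factor for burning jet fuel
LB_PER_KG = 2.20462

def co2_per_passenger_kg(fuel_burn_kg, passengers):
    # CO2 from burning jet fuel only, shared equally among passengers
    return fuel_burn_kg * KG_CO2_PER_KG_JET_FUEL / passengers

# Hypothetical numbers: ~9.4 t of fuel split across 150 passengers
per_pax = co2_per_passenger_kg(fuel_burn_kg=9400, passengers=150)
print(round(per_pax), "kg")              # ≈ 198 kg
print(round(per_pax * LB_PER_KG), "lbs") # ≈ 437 lbs
```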

Map of flight path and driving directions from Jabalpur to Huangping

See the map of the shortest flight path between Jabalpur Airport (JLR) and Kaili Airport (KJH).

Airport information

Origin: Jabalpur Airport
City: Jabalpur
Country: India
IATA Code: JLR
ICAO Code: VAJB
Coordinates: 23°10′40″N, 80°3′7″E
Destination: Kaili Airport
City: Huangping
Country: China
IATA Code: KJH
ICAO Code: ZUKJ
Coordinates: 26°58′19″N, 107°59′16″E
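
The coordinates above are given in degrees, minutes and seconds. To use them with the distance formulas earlier on the page they need to be converted to decimal degrees, for example with a small helper like this sketch.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # North/East are positive, South/West negative
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Jabalpur Airport (JLR): 23°10′40″N, 80°3′7″E
print(dms_to_decimal(23, 10, 40, "N"), dms_to_decimal(80, 3, 7, "E"))    # ≈ 23.1778, 80.0519
# Kaili Airport (KJH): 26°58′19″N, 107°59′16″E
print(dms_to_decimal(26, 58, 19, "N"), dms_to_decimal(107, 59, 16, "E")) # ≈ 26.9719, 107.9878
```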