
How far is Bijie from Jabalpur?

The distance between Jabalpur (Jabalpur Airport) and Bijie (Bijie Feixiong Airport) is 1613 miles / 2597 kilometers / 1402 nautical miles.

The driving distance from Jabalpur (JLR) to Bijie (BFJ) is 2355 miles / 3790 kilometers, and travel time by car is about 47 hours 59 minutes.

Jabalpur Airport – Bijie Feixiong Airport

Distance: 1613 miles / 2597 kilometers / 1402 nautical miles
Flight time: 3 h 33 min
Time difference: 2 h 30 min
CO2 emission: 187 kg


Distance from Jabalpur to Bijie

There are several ways to calculate the distance from Jabalpur to Bijie. Here are two standard methods:

Vincenty's formula (applied above)
  • 1613.425 miles
  • 2596.556 kilometers
  • 1402.028 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
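As a quick sketch of the ellipsoidal calculation, the geopy library's geodesic class solves the same problem on the WGS-84 ellipsoid (it uses Karney's algorithm rather than Vincenty's original iteration, so the result may differ from the figures above by less than a metre). The decimal coordinates below are converted from the airport information section and rounded to four places.

```python
# Ellipsoidal (WGS-84) distance between JLR and BFJ using geopy.
# geopy's geodesic implements Karney's method, a modern replacement
# for Vincenty's iteration on the same ellipsoid.
from geopy.distance import geodesic

jabalpur = (23.1778, 80.0519)   # JLR: 23°10′40″N, 80°3′7″E
bijie = (27.2669, 105.4719)     # BFJ: 27°16′1″N, 105°28′19″E

d = geodesic(jabalpur, bijie)
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} nmi")
# Expect roughly 1613 mi / 2597 km / 1402 nmi, matching the table above.
```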

Haversine formula
  • 1610.913 miles
  • 2592.512 kilometers
  • 1399.845 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance, i.e. the shortest path between the two points along the surface of the sphere.
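A minimal haversine sketch in Python, assuming a mean earth radius of 6371 km (the exact radius used by the calculator is not stated, so the last digit of the result may differ slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# JLR (23°10′40″N, 80°3′7″E) to BFJ (27°16′1″N, 105°28′19″E)
km = haversine_km(23.1778, 80.0519, 27.2669, 105.4719)
print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km / 1.852:.1f} nmi")
# Prints roughly 2592.5 km / 1610.9 mi / 1399.8 nmi, matching the
# haversine figures above.
```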

How long does it take to fly from Jabalpur to Bijie?

The estimated flight time from Jabalpur Airport to Bijie Feixiong Airport is 3 hours and 33 minutes.
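As a rough sanity check (not the calculator's own model, which is not documented here), a flight time of this order falls out of dividing the great-circle distance by an assumed average block speed. The 455 mph figure below is a hypothetical value chosen purely for illustration.

```python
# Back-of-envelope flight-time estimate. The assumed average block speed
# and any taxi/climb allowances are illustrative assumptions only.
distance_miles = 1613.425        # Vincenty distance from the section above
avg_block_speed_mph = 455        # hypothetical average speed

hours = distance_miles / avg_block_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"~{h} h {m} min")         # ~3 h 33 min under these assumptions
```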

Flight carbon footprint between Jabalpur Airport (JLR) and Bijie Feixiong Airport (BFJ)

On average, flying from Jabalpur to Bijie generates about 187 kg (412 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Jabalpur to Bijie

See the map of the shortest flight path between Jabalpur Airport (JLR) and Bijie Feixiong Airport (BFJ).

Airport information

Origin: Jabalpur Airport
City: Jabalpur
Country: India
IATA Code: JLR
ICAO Code: VAJB
Coordinates: 23°10′40″N, 80°3′7″E
Destination: Bijie Feixiong Airport
City: Bijie
Country: China
IATA Code: BFJ
ICAO Code: ZUBJ
Coordinates: 27°16′1″N, 105°28′19″E