
How far is Bijie from Bhopal?

The distance between Bhopal (Raja Bhoj Airport) and Bijie (Bijie Feixiong Airport) is 1778 miles / 2862 kilometers / 1545 nautical miles.

The driving distance from Bhopal (BHO) to Bijie (BFJ) is 2543 miles / 4093 kilometers, and travel time by car is about 51 hours 8 minutes.

Raja Bhoj Airport – Bijie Feixiong Airport

Distance: 1778 miles / 2862 kilometers / 1545 nautical miles
Flight time: 3 h 52 min
Time difference: 2 h 30 min
CO2 emission: 198 kg


Distance from Bhopal to Bijie

There are several ways to calculate the distance from Bhopal to Bijie. Here are two standard methods:

Vincenty's formula (applied above)
  • 1778.180 miles
  • 2861.703 kilometers
  • 1545.196 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
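As a rough illustration (not the calculator's actual code), here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The helper name vincenty_distance is hypothetical, and the decimal coordinates are converted from the airport data listed at the bottom of this page; run as-is, it should land very close to the 2861.703 km figure quoted above.

import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
              (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 *
              (cos_sigma * (-1 + 2 * cos_2sm ** 2) -
               B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
               (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# BHO (23°17′15″N, 77°20′14″E) to BFJ (27°16′1″N, 105°28′19″E)
metres = vincenty_distance(23.2875, 77.337222, 27.266944, 105.471944)
print(f"{metres / 1609.344:.3f} mi, {metres / 1000:.3f} km, {metres / 1852:.3f} nmi")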

Haversine formula
  • 1775.341 miles
  • 2857.134 kilometers
  • 1542.729 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
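A haversine sketch in the same spirit, assuming the conventional mean Earth radius of 6371 km; the slight shortfall against the Vincenty result above comes from treating the Earth as a perfect sphere.

import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same BHO and BFJ coordinates as above; expect roughly 2857 km
km = haversine_distance(23.2875, 77.337222, 27.266944, 105.471944)
print(f"{km / 1.609344:.3f} mi, {km:.3f} km, {km / 1.852:.3f} nmi")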

How long does it take to fly from Bhopal to Bijie?

The estimated flight time from Raja Bhoj Airport to Bijie Feixiong Airport is 3 hours and 52 minutes.

Flight carbon footprint between Raja Bhoj Airport (BHO) and Bijie Feixiong Airport (BFJ)

On average, flying from Bhopal to Bijie generates about 198 kg of CO2 per passenger, which is roughly 437 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
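The pound figure is plain unit arithmetic; a one-line Python check, assuming the standard 2.20462 lb/kg conversion factor:

LB_PER_KG = 2.20462            # pounds per kilogram
print(round(198 * LB_PER_KG))  # -> 437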

Map of flight path and driving directions from Bhopal to Bijie

See the map of the shortest flight path between Raja Bhoj Airport (BHO) and Bijie Feixiong Airport (BFJ).

Airport information

Origin Raja Bhoj Airport
City: Bhopal
Country: India
IATA Code: BHO
ICAO Code: VABP
Coordinates: 23°17′15″N, 77°20′14″E
Destination Bijie Feixiong Airport
City: Bijie
Country: China
IATA Code: BFJ
ICAO Code: ZUBJ
Coordinates: 27°16′1″N, 105°28′19″E