How far is Bijie from Abakan?

The distance between Abakan (Abakan International Airport) and Bijie (Bijie Feixiong Airport) is 1963 miles / 3160 kilometers / 1706 nautical miles.

The driving distance from Abakan (ABA) to Bijie (BFJ) is 2659 miles / 4280 kilometers, and travel time by car is about 60 hours 26 minutes.

Abakan International Airport – Bijie Feixiong Airport

1963 miles
3160 kilometers
1706 nautical miles

Distance from Abakan to Bijie

There are several ways to calculate the distance from Abakan to Bijie. Here are two standard methods:

Vincenty's formula (applied above)
  • 1963.260 miles
  • 3159.561 kilometers
  • 1706.026 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
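
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates from the table at the bottom of this page converted to decimal degrees. It is an illustration only: it omits the special handling Vincenty's method needs near antipodal points, and the tolerance and iteration cap are arbitrary choices.

```python
import math

# WGS-84 ellipsoid parameters
A_AXIS = 6378137.0                 # semi-major axis, meters
F = 1 / 298.257223563              # flattening
B_AXIS = (1 - F) * A_AXIS          # semi-minor axis, meters

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in meters via Vincenty's inverse formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)

    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - F) * math.tan(phi1))
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                      # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero for equatorial lines where cos2_alpha == 0
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                          * (-3 + 4 * cos_2sm ** 2)))
    return B_AXIS * A * (sigma - delta_sigma)   # meters

# ABA and BFJ coordinates from the airport table below, in decimal degrees
meters = vincenty_inverse(53.74, 91.385, 27.2669, 105.4719)
print(f"{meters / 1000:.3f} km")   # ≈ 3159.6 km, matching the figure above
```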

Haversine formula
  • 1964.864 miles
  • 3162.142 kilometers
  • 1707.420 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
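
A corresponding haversine sketch in Python. The mean earth radius of 6371 km is a common convention rather than a fixed standard, and small changes to it shift the result slightly.

```python
import math

EARTH_RADIUS_KM = 6371.0   # mean earth radius; a modeling choice

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

print(f"{haversine_km(53.74, 91.385, 27.2669, 105.4719):.1f} km")
# ≈ 3162 km, matching the figure above
```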

How long does it take to fly from Abakan to Bijie?

The estimated flight time from Abakan International Airport to Bijie Feixiong Airport is 4 hours and 13 minutes.
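
The page does not state the model behind this estimate. Working backward from its own numbers, 1963 miles in 4 hours 13 minutes implies an average block speed of roughly 465 mph; the sketch below uses that back-calculated figure as an assumption, purely to show the arithmetic.

```python
# Assumed average block speed, back-calculated so the result matches the
# page's 4 h 13 min estimate. An illustration, not the site's actual model.
AVG_SPEED_MPH = 465.0

distance_miles = 1963
hours = distance_miles / AVG_SPEED_MPH        # ≈ 4.22 hours
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")                       # 4 h 13 min
```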

Flight carbon footprint between Abakan International Airport (ABA) and Bijie Feixiong Airport (BFJ)

On average, flying from Abakan to Bijie generates about 214 kg (472 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
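
The page does not publish its emissions model. Back-calculating from its own figures gives an implied factor of about 0.068 kg of CO2 per passenger-kilometer, which the sketch below uses; real calculators vary the factor with aircraft type, seating class, and load factor.

```python
# Implied per-passenger emissions factor, back-calculated from the page's
# own figures (214 kg over 3160 km). An assumption, not the site's model.
KG_CO2_PER_KM = 214 / 3160        # ≈ 0.0677 kg CO2 per passenger-km
KG_TO_LB = 2.20462                # exact-enough conversion factor

distance_km = 3160
co2_kg = distance_km * KG_CO2_PER_KM
print(f"{co2_kg:.0f} kg ≈ {co2_kg * KG_TO_LB:.0f} lb")   # 214 kg ≈ 472 lb
```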

Map of flight path and driving directions from Abakan to Bijie

See the map of the shortest flight path between Abakan International Airport (ABA) and Bijie Feixiong Airport (BFJ).

Airport information

Origin: Abakan International Airport
City: Abakan
Country: Russia
IATA Code: ABA
ICAO Code: UNAA
Coordinates: 53°44′24″N, 91°23′6″E
Destination: Bijie Feixiong Airport
City: Bijie
Country: China
IATA Code: BFJ
ICAO Code: ZUBJ
Coordinates: 27°16′1″N, 105°28′19″E
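
The distance sketches above take decimal degrees, while this table lists degrees, minutes, and seconds. A small helper for the conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Coordinates from the table above
print(dms_to_decimal(53, 44, 24, "N"), dms_to_decimal(91, 23, 6, "E"))
# 53.74 91.385  (ABA)
print(dms_to_decimal(27, 16, 1, "N"), dms_to_decimal(105, 28, 19, "E"))
# ≈ 27.2669 105.4719  (BFJ)
```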