
How far is Burqin from Bogorodskoye?

The distance between Bogorodskoye (Bogorodskoye Airport) and Burqin (Burqin Kanas Airport) is 2329 miles / 3748 kilometers / 2024 nautical miles.

The driving distance from Bogorodskoye (BQG) to Burqin (KJI) is 3760 miles / 6051 kilometers, and travel time by car is about 74 hours 25 minutes.

Bogorodskoye Airport – Burqin Kanas Airport

2329 miles · 3748 kilometers · 2024 nautical miles


Distance from Bogorodskoye to Burqin

There are several ways to calculate the distance from Bogorodskoye to Burqin. Here are two standard methods:

Vincenty's formula (applied above)
  • 2329.084 miles
  • 3748.298 kilometers
  • 2023.919 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
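Vincenty's inverse method is iterative. A minimal Python sketch is below; the WGS-84 ellipsoid constants are an assumption (the page doesn't state which ellipsoid it uses, though WGS-84 is the usual choice), and the coordinates come from the airport table further down:

```python
from math import atan, atan2, radians, sin, cos, tan, sqrt

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution: geodesic distance on the WGS-84 ellipsoid."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = a * (1 - f)              # semi-minor axis (m)
    L = radians(lon2 - lon1)
    # Reduced latitudes
    U1 = atan((1 - f) * tan(radians(lat1)))
    U2 = atan((1 - f) * tan(radians(lat2)))
    sU1, cU1, sU2, cU2 = sin(U1), cos(U1), sin(U2), cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sl, cl = sin(lam), cos(lam)
        sin_sig = sqrt((cU2 * sl) ** 2 + (cU1 * sU2 - sU1 * cU2 * cl) ** 2)
        cos_sig = sU1 * sU2 + cU1 * cU2 * cl
        sig = atan2(sin_sig, cos_sig)
        sin_alp = cU1 * cU2 * sl / sin_sig
        cos2_alp = 1 - sin_alp ** 2
        cos_2sm = cos_sig - 2 * sU1 * sU2 / cos2_alp
        C = f / 16 * cos2_alp * (4 + f * (4 - 3 * cos2_alp))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alp * (
            sig + C * sin_sig * (cos_2sm + C * cos_sig * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alp * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sig = B * sin_sig * (cos_2sm + B / 4 * (
        cos_sig * (2 * cos_2sm ** 2 - 1)
        - B / 6 * cos_2sm * (4 * sin_sig ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return b * A * (sig - d_sig) / 1000.0  # metres -> kilometres

# BQG: 52°22′48″N 140°26′52″E  ->  52.3800, 140.4478
# KJI: 48°13′20″N  86°59′45″E  ->  48.2222,  86.9958
print(vincenty_km(52.3800, 140.4478, 48.2222, 86.9958))  # ~3748 km
```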

Haversine formula
  • 2321.973 miles
  • 3736.854 kilometers
  • 2017.740 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
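The haversine calculation can be reproduced in a few lines of Python. This is a sketch: the coordinates are taken from the airport table below, and the mean Earth radius of 6371 km is an assumption (the page doesn't state its radius constant), so the result matches the figure above only to within a few kilometres:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    h = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(h))

# BQG: 52°22′48″N 140°26′52″E  ->  52.3800, 140.4478
# KJI: 48°13′20″N  86°59′45″E  ->  48.2222,  86.9958
print(haversine_km(52.3800, 140.4478, 48.2222, 86.9958))  # ~3737 km
```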

How long does it take to fly from Bogorodskoye to Burqin?

The estimated flight time from Bogorodskoye Airport to Burqin Kanas Airport is 4 hours and 54 minutes.
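The site doesn't publish its flight-time model. A common rough heuristic (an assumption here, not the calculator's actual method) adds a fixed allowance for taxi, climb, and descent to cruise time over the great-circle distance:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: fixed overhead plus cruise at a constant speed."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m} min"

print(estimate_flight_time(2329))  # 5 h 9 min
```

With these assumed constants the estimate comes out a little above the 4 hours 54 minutes quoted here, which suggests the calculator assumes a somewhat faster cruise speed or smaller overhead.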

Flight carbon footprint between Bogorodskoye Airport (BQG) and Burqin Kanas Airport (KJI)

On average, flying from Bogorodskoye to Burqin generates about 255 kg of CO2 per passenger, which is roughly 563 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
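The kilogram-to-pound conversion is a single multiplication (1 lb is exactly 0.45359237 kg). Note that the rounded 255 kg converts to about 562 lb, so the 563 lb figure presumably comes from the unrounded CO2 estimate:

```python
KG_PER_LB = 0.45359237  # exact by definition of the international pound

def kg_to_lb(kg):
    """Convert a mass in kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(255)))  # 562
```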

Map of flight path and driving directions from Bogorodskoye to Burqin

See the map of the shortest flight path between Bogorodskoye Airport (BQG) and Burqin Kanas Airport (KJI).

Airport information

Origin Bogorodskoye Airport
City: Bogorodskoye
Country: Russia
IATA Code: BQG
ICAO Code: UHNB
Coordinates: 52°22′48″N, 140°26′52″E
Destination Burqin Kanas Airport
City: Burqin
Country: China
IATA Code: KJI
ICAO Code: ZWKN
Coordinates: 48°13′20″N, 86°59′45″E