
How far is Belgrade from Krasnoyarsk?

The distance between Krasnoyarsk (Krasnoyarsk International Airport) and Belgrade (Belgrade Nikola Tesla Airport) is 3122 miles / 5025 kilometers / 2713 nautical miles.
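The three figures are the same distance in different units. A minimal sketch of the conversions, starting from the kilometer value and using the exact definitions of the statute mile and the nautical mile:

```python
# Exact conversion factors by definition
KM_PER_MILE = 1.609344           # 1 statute mile = 1.609344 km
KM_PER_NAUTICAL_MILE = 1.852     # 1 nautical mile = 1.852 km

km = 5025.144                    # Vincenty distance from the section below
miles = km / KM_PER_MILE         # ≈ 3122.5 statute miles
nautical = km / KM_PER_NAUTICAL_MILE  # ≈ 2713.4 nautical miles
```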

Krasnoyarsk International Airport – Belgrade Nikola Tesla Airport

  • 3122 miles
  • 5025 kilometers
  • 2713 nautical miles


Distance from Krasnoyarsk to Belgrade

There are several ways to calculate the distance from Krasnoyarsk to Belgrade. Here are two standard methods:

Vincenty's formula (applied above)
  • 3122.479 miles
  • 5025.144 kilometers
  • 2713.360 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
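Vincenty's inverse method iterates on the difference in longitude over an ellipsoid until it converges. A sketch of the standard algorithm on the WGS-84 ellipsoid (the ellipsoid choice is an assumption; the page does not state which one it uses), fed with the airport coordinates listed below:

```python
import math

# WGS-84 ellipsoid parameters
A = 6378137.0           # semi-major axis (m)
F = 1 / 298.257223563   # flattening
B = (1 - F) * A         # semi-minor axis (m)

def vincenty_inverse_m(lat1, lon1, lat2, lon2, iterations=200, tol=1e-12):
    """Geodesic distance in metres between two lat/lon points (degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - F) * math.tan(phi1))
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(iterations):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # equatorial line fallback
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    # Vincenty's modified (Helmert) series for the final distance
    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    k1 = (math.sqrt(1 + u2) - 1) / (math.sqrt(1 + u2) + 1)
    A_c = (1 + k1 ** 2 / 4) / (1 - k1)
    B_c = k1 * (1 - 3 * k1 ** 2 / 8)
    d_sigma = B_c * sin_sigma * (
        cos_2sm + B_c / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B_c / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return B * A_c * (sigma - d_sigma)

# KJA (56°10′22″N, 92°29′35″E) → BEG (44°49′6″N, 20°18′32″E)
km = vincenty_inverse_m(56.172778, 92.493056, 44.818333, 20.308889) / 1000
```

On these coordinates the result lands close to the 5025.144 km figure above; small differences can come from the ellipsoid and coordinate precision assumed here.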

Haversine formula
  • 3113.373 miles
  • 5010.488 kilometers
  • 2705.447 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
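The haversine formula needs only a few trigonometric calls once an Earth radius is chosen. A sketch, assuming the commonly used 6371 km mean radius (the page does not state which radius it uses):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius; the exact value is a modelling choice

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points (degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# KJA (56°10′22″N, 92°29′35″E) → BEG (44°49′6″N, 20°18′32″E)
km = haversine_km(56.172778, 92.493056, 44.818333, 20.308889)
```

The spherical result comes out roughly 15 km shorter than the ellipsoidal Vincenty figure on this route, matching the two sets of numbers above.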

How long does it take to fly from Krasnoyarsk to Belgrade?

The estimated flight time from Krasnoyarsk International Airport to Belgrade Nikola Tesla Airport is 6 hours and 24 minutes.
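The page does not say how the estimate is derived; a common approach is distance divided by an assumed average block speed. The ~488 mph speed below is a hypothetical value chosen because it reproduces the 6 h 24 min figure for 3122 miles, not a published parameter:

```python
AVG_SPEED_MPH = 488  # assumed average block speed, including climb/descent

def flight_time(distance_miles, speed_mph=AVG_SPEED_MPH):
    """Rough flight time as (hours, minutes)."""
    hours = distance_miles / speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

h, m = flight_time(3122)  # → (6, 24)
```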

Flight carbon footprint between Krasnoyarsk International Airport (KJA) and Belgrade Nikola Tesla Airport (BEG)

On average, flying from Krasnoyarsk to Belgrade generates about 349 kg (769 lbs) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
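The kg-to-lbs conversion follows from the exact definition of the pound, and dividing the total by the route length gives the implied per-kilometre emission rate (the rate itself is derived here, not published by the page):

```python
KG_PER_LB = 0.45359237       # exact: 1 lb = 0.45359237 kg

co2_kg = 349                  # per-passenger estimate from the page
co2_lbs = co2_kg / KG_PER_LB  # ≈ 769 lbs

route_km = 5025
per_km = co2_kg / route_km    # ≈ 0.069 kg CO2 per passenger-km (derived)
```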

Map of flight path from Krasnoyarsk to Belgrade

See the map of the shortest flight path between Krasnoyarsk International Airport (KJA) and Belgrade Nikola Tesla Airport (BEG).

Airport information

Origin Krasnoyarsk International Airport
City: Krasnoyarsk
Country: Russia
IATA Code: KJA
ICAO Code: UNKL
Coordinates: 56°10′22″N, 92°29′35″E
Destination Belgrade Nikola Tesla Airport
City: Belgrade
Country: Serbia
IATA Code: BEG
ICAO Code: LYBE
Coordinates: 44°49′6″N, 20°18′32″E