
How far is Belgrade from Curitiba?

The distance between Curitiba (Curitiba Afonso Pena International Airport) and Belgrade (Belgrade Nikola Tesla Airport) is 6523 miles / 10497 kilometers / 5668 nautical miles.

Curitiba Afonso Pena International Airport – Belgrade Nikola Tesla Airport

Distance from Curitiba to Belgrade

There are several ways to calculate the distance from Curitiba to Belgrade. Here are two standard methods:

Vincenty's formula (applied above)
  • 6522.526 miles
  • 10496.988 kilometers
  • 5667.920 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
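For reference, here is a minimal Python sketch of Vincenty's inverse formula, assuming the WGS-84 ellipsoid (the page does not state which ellipsoid it uses, so that choice is an assumption; the function name and coordinates are illustrative):

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in meters between two points, Vincenty's inverse formula
    on the WGS-84 ellipsoid (an assumed datum)."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero on equatorial lines where cos2_alpha == 0
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# CWB and BEG in decimal degrees (converted from the coordinates listed below)
meters = vincenty_inverse(-25.528333, -49.175556, 44.818333, 20.308889)
print(meters / 1609.344, "miles")  # ~6522.5
print(meters / 1000, "km")         # ~10497
print(meters / 1852, "NM")         # ~5667.9
```

Note that Vincenty's iteration is known not to converge for nearly antipodal points; production geodesic libraries handle that case separately.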

Haversine formula
  • 6533.241 miles
  • 10514.232 kilometers
  • 5677.231 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
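A matching sketch of the haversine formula, assuming the commonly used mean Earth radius of 6,371 km (the radius is an assumption; a slightly different value would shift the figures above):

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r (mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

km = haversine(-25.528333, -49.175556, 44.818333, 20.308889)
print(km, "km")                # ~10514
print(km / 1.609344, "miles")  # ~6533
print(km / 1.852, "NM")        # ~5677
```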

How long does it take to fly from Curitiba to Belgrade?

The estimated flight time from Curitiba Afonso Pena International Airport to Belgrade Nikola Tesla Airport is 12 hours and 50 minutes.
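The page does not state how the estimate is derived. One plausible reconstruction, offered purely as an assumption, is an average cruise speed of about 850 km/h plus a 30-minute allowance for takeoff and landing, which reproduces the quoted time to within a minute:

```python
CRUISE_KMH = 850     # assumed average cruise speed (hypothetical)
OVERHEAD_MIN = 30    # assumed takeoff/landing allowance (hypothetical)

distance_km = 10497
total_min = distance_km / CRUISE_KMH * 60 + OVERHEAD_MIN
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # 12 h 51 min
```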

Flight carbon footprint between Curitiba Afonso Pena International Airport (CWB) and Belgrade Nikola Tesla Airport (BEG)

On average, flying from Curitiba to Belgrade generates about 788 kg (1,738 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
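The emissions model is not published either. Dividing the quoted 788 kg by the 10,497 km distance implies roughly 75 g of CO2 per passenger-kilometer, a plausible long-haul intensity; treating that back-derived factor as an assumption, a minimal sketch looks like this:

```python
CO2_KG_PER_PAX_KM = 0.075  # back-derived from the quoted figure (assumption)
KG_TO_LB = 2.20462

distance_km = 10497
co2_kg = CO2_KG_PER_PAX_KM * distance_km
print(round(co2_kg), "kg CO2 per passenger")  # ~787
print(round(co2_kg * KG_TO_LB), "lb")         # ~1736
```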

Map of flight path from Curitiba to Belgrade

See the map of the shortest flight path between Curitiba Afonso Pena International Airport (CWB) and Belgrade Nikola Tesla Airport (BEG).

Airport information

Origin: Curitiba Afonso Pena International Airport
City: Curitiba
Country: Brazil
IATA Code: CWB
ICAO Code: SBCT
Coordinates: 25°31′42″S, 49°10′32″W

Destination: Belgrade Nikola Tesla Airport
City: Belgrade
Country: Serbia
IATA Code: BEG
ICAO Code: LYBE
Coordinates: 44°49′6″N, 20°18′32″E
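The coordinates above are given in degrees, minutes, and seconds, while the distance formulas expect decimal degrees. A small helper for the conversion (the function name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(25, 31, 42, "S"))  # -25.5283... (CWB latitude)
print(dms_to_decimal(49, 10, 32, "W"))  # -49.1756... (CWB longitude)
print(dms_to_decimal(44, 49, 6, "N"))   #  44.8183... (BEG latitude)
print(dms_to_decimal(20, 18, 32, "E"))  #  20.3089... (BEG longitude)
```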