How far is Belgrade from Cuneo?
The distance between Cuneo (Cuneo International Airport) and Belgrade (Belgrade Nikola Tesla Airport) is 625 miles / 1005 kilometers / 543 nautical miles.
The driving distance from Cuneo (CUF) to Belgrade (BEG) is 739 miles / 1190 kilometers, and travel time by car is about 12 hours 17 minutes.
Cuneo International Airport – Belgrade Nikola Tesla Airport
Distance from Cuneo to Belgrade
There are several ways to calculate the distance from Cuneo to Belgrade. Here are two standard methods:
Vincenty's formula (applied above)
- 624.588 miles
- 1005.177 kilometers
- 542.752 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
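The iterative inverse method can be sketched in Python. The WGS-84 ellipsoid parameters below are the standard choice; whether the site used exactly this ellipsoid is an assumption, as are the decimal coordinates (converted from the DMS values in the airport tables):

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: geodesic distance in metres on the
    WGS-84 ellipsoid (parameters assumed, not stated by the source)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):  # iterate longitude difference until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# CUF -> BEG; should land near the 1005.177 km quoted above
km = vincenty_distance_m(44.54694, 7.62306, 44.81833, 20.30889) / 1000
print(f"{km:.1f} km")
```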
Haversine formula
- 622.859 miles
- 1002.394 kilometers
- 541.249 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
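A minimal Python sketch of the haversine calculation, using the airport coordinates converted to decimal degrees and an assumed mean Earth radius of 6,371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    R = 6371.0  # mean Earth radius in km (assumed value)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# CUF and BEG in decimal degrees (converted from the airport tables)
km = haversine_km(44.54694, 7.62306, 44.81833, 20.30889)
print(f"{km:.1f} km, {km * 0.621371:.1f} mi, {km * 0.539957:.1f} nmi")
```

The result differs from the Vincenty figure by about 3 km, which is the expected error of the spherical-Earth assumption at this distance.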
How long does it take to fly from Cuneo to Belgrade?
The estimated flight time from Cuneo International Airport to Belgrade Nikola Tesla Airport is 1 hour and 40 minutes.
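Estimates like this typically follow a rule of thumb: cruise time over the great-circle distance plus a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute allowance below are assumptions for illustration, not the site's stated method:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airliner flight-time estimate: cruise time plus a fixed
    taxi/climb/descent allowance (both parameters are assumptions)."""
    return distance_miles / cruise_mph * 60 + overhead_min

mins = estimated_flight_minutes(625)
print(f"{int(mins // 60)} h {int(mins % 60)} min")  # 1 h 45 min
```

With these assumed parameters the sketch gives 1 hour 45 minutes, in the same ballpark as the quoted 1 hour 40 minutes; the exact figure depends on the speed and overhead the site actually uses.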
What is the time difference between Cuneo and Belgrade?
There is no time difference: both Italy and Serbia observe Central European Time (CET, UTC+1) and Central European Summer Time (CEST, UTC+2).
Flight carbon footprint between Cuneo International Airport (CUF) and Belgrade Nikola Tesla Airport (BEG)
On average, flying from Cuneo to Belgrade generates about 116 kg (roughly 255 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
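The pound figure is just a unit conversion of the 116 kg estimate, using the exact definition of the avoirdupois pound:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

print(f"{kg_to_lb(116):.1f} lb")  # 255.7 lb, quoted above as 255 lb
```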
Map of flight path and driving directions from Cuneo to Belgrade
See the map of the shortest flight path between Cuneo International Airport (CUF) and Belgrade Nikola Tesla Airport (BEG).
Airport information
| Origin | Cuneo International Airport |
| --- | --- |
| City | Cuneo |
| Country | Italy |
| IATA Code | CUF |
| ICAO Code | LIMZ |
| Coordinates | 44°32′49″N, 7°37′23″E |
| Destination | Belgrade Nikola Tesla Airport |
| --- | --- |
| City | Belgrade |
| Country | Serbia |
| IATA Code | BEG |
| ICAO Code | LYBE |
| Coordinates | 44°49′6″N, 20°18′32″E |
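The coordinates in both tables are given in degrees, minutes, and seconds; the distance formulas above want decimal degrees. A small conversion helper:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# CUF: 44°32′49″N, 7°37′23″E
print(dms_to_decimal(44, 32, 49, "N"), dms_to_decimal(7, 37, 23, "E"))
# BEG: 44°49′6″N, 20°18′32″E
print(dms_to_decimal(44, 49, 6, "N"), dms_to_decimal(20, 18, 32, "E"))
```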