How far is Belgrade from San Martín de los Andes?
The distance between San Martín de los Andes (Aviador Carlos Campos Airport) and Belgrade (Belgrade Nikola Tesla Airport) is 8134 miles / 13090 kilometers / 7068 nautical miles.
Aviador Carlos Campos Airport – Belgrade Nikola Tesla Airport
Distance from San Martín de los Andes to Belgrade
There are several ways to calculate the distance from San Martín de los Andes to Belgrade. Here are two standard methods:
Vincenty's formula (applied above)
- 8133.908 miles
- 13090.256 kilometers
- 7068.173 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
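As an illustration, the iterative Vincenty inverse solution on the WGS-84 ellipsoid can be sketched in Python, using the airport coordinates from the tables below. This is an independent implementation for illustration only, not the calculator actually used for the figures above:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty solution on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# CPC: 40°4′31″S, 71°8′14″W — BEG: 44°49′6″N, 20°18′32″E
cpc = (-(40 + 4/60 + 31/3600), -(71 + 8/60 + 14/3600))
beg = (44 + 49/60 + 6/3600, 20 + 18/60 + 32/3600)
dist_km = vincenty_distance(*cpc, *beg) / 1000
print(f"{dist_km:.3f} km")   # close to the 13090.256 km figure above
```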
Haversine formula
- 8144.134 miles
- 13106.713 kilometers
- 7077.059 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth, giving the great-circle distance – the shortest path between two points along the surface.
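The haversine formula is short enough to show in full. The sketch below assumes a mean Earth radius of 6371 km, so the result lands within a few kilometres of the figure above (the exact value depends on the radius the calculator assumes):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# CPC: 40°4′31″S, 71°8′14″W — BEG: 44°49′6″N, 20°18′32″E
dist_km = haversine_distance(-(40 + 4/60 + 31/3600), -(71 + 8/60 + 14/3600),
                             44 + 49/60 + 6/3600, 20 + 18/60 + 32/3600)
print(f"{dist_km:.1f} km")
```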
How long does it take to fly from San Martín de los Andes to Belgrade?
The estimated flight time from Aviador Carlos Campos Airport to Belgrade Nikola Tesla Airport is 15 hours and 54 minutes.
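A rough duration model – an assumption on my part, since the site does not publish its method – divides the great-circle distance by a typical long-haul cruise speed and adds a fixed allowance for taxi, climb and descent:

```python
CRUISE_MPH = 530      # assumed average speed of a long-haul airliner
OVERHEAD_H = 0.5      # assumed allowance for taxi, climb and descent

distance_miles = 8134                     # Vincenty distance from above
hours = distance_miles / CRUISE_MPH + OVERHEAD_H
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"about {h} h {m:02d} min")
```

With these assumed parameters the model gives roughly 15 h 51 min, in the same ballpark as the 15 h 54 min estimate above.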
What is the time difference between San Martín de los Andes and Belgrade?
San Martín de los Andes is on Argentina time (UTC−3, with no daylight saving time), while Belgrade observes Central European Time (UTC+1, or UTC+2 in summer). Belgrade is therefore 4 hours ahead in winter and 5 hours ahead during European summer time.
Flight carbon footprint between Aviador Carlos Campos Airport (CPC) and Belgrade Nikola Tesla Airport (BEG)
On average, flying from San Martín de los Andes to Belgrade generates about 1,018 kg of CO2 per passenger; 1,018 kilograms equals 2,244 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
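The per-passenger figure scales linearly with distance. The sketch below uses an emission factor reverse-derived from the numbers above – an assumption for illustration, not the site's published methodology:

```python
KG_PER_PASSENGER_KM = 0.0778   # assumed factor, reverse-derived from the figures above
KG_TO_LBS = 2.20462            # kilograms to pounds

distance_km = 13090            # Vincenty distance from above
co2_kg = distance_km * KG_PER_PASSENGER_KM
print(f"{co2_kg:.0f} kg CO2 per passenger "
      f"(about {co2_kg * KG_TO_LBS:.0f} lbs)")
```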
Map of flight path from San Martín de los Andes to Belgrade
See the map of the shortest flight path between Aviador Carlos Campos Airport (CPC) and Belgrade Nikola Tesla Airport (BEG).
Airport information
| Origin | Aviador Carlos Campos Airport |
| --- | --- |
| City: | San Martín de los Andes |
| Country: | Argentina |
| IATA Code: | CPC |
| ICAO Code: | SAZY |
| Coordinates: | 40°4′31″S, 71°8′14″W |
| Destination | Belgrade Nikola Tesla Airport |
| --- | --- |
| City: | Belgrade |
| Country: | Serbia |
| IATA Code: | BEG |
| ICAO Code: | LYBE |
| Coordinates: | 44°49′6″N, 20°18′32″E |