How far is Naxos from San Martin de los Andes?
The distance between San Martin de los Andes (Aviador Carlos Campos Airport) and Naxos (Naxos Island National Airport) is 8091 miles / 13022 kilometers / 7031 nautical miles.
Aviador Carlos Campos Airport – Naxos Island National Airport
Distance from San Martin de los Andes to Naxos
There are several ways to calculate the distance from San Martin de los Andes to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 8091.218 miles
- 13021.553 kilometers
- 7031.076 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
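As a rough illustration, here is a minimal Python sketch of the Vincenty inverse formula on the WGS-84 ellipsoid. The decimal coordinates (converted from the DMS values in the airport tables below), the convergence tolerance, and the iteration cap are assumptions; this is not necessarily the page's exact implementation, and the iteration can fail to converge for nearly antipodal points.

```python
import math

# WGS-84 ellipsoid parameters
A_AXIS = 6378137.0               # semi-major axis in metres
F = 1 / 298.257223563            # flattening
B_AXIS = (1 - F) * A_AXIS        # semi-minor axis in metres

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in kilometres on the WGS-84 ellipsoid (Vincenty inverse formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(phi1))    # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        if cos2_alpha != 0:
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        else:
            cos_2sigma_m = 0.0                   # both points on the equator
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma) / 1000.0   # metres -> kilometres

# CPC (40°4′31″S, 71°8′14″W) to JNX (37°4′51″N, 25°22′5″E), in decimal degrees
print(vincenty_inverse(-40.075278, -71.137222, 37.080833, 25.368056))
# should print roughly 13021.5 km, matching the figure quoted above
```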
Haversine formula
- 8098.810 miles
- 13033.771 kilometers
- 7037.674 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (the great-circle distance, which is the shortest distance between two points along the surface).
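A minimal haversine sketch in Python, assuming a mean Earth radius of 6,371 km (the exact radius used by the page is not stated, so small differences from the quoted figure are expected):

```python
import math

EARTH_RADIUS_KM = 6371.0  # assumed mean Earth radius

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = math.sin(d_phi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# CPC -> JNX with the same decimal coordinates as above
km = haversine(-40.075278, -71.137222, 37.080833, 25.368056)
print(f"{km:.1f} km = {km / 1.609344:.1f} mi = {km / 1.852:.1f} nm")
# roughly 13,030 km; the page quotes 13033.771 km (the gap comes from the assumed radius)
```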
How long does it take to fly from San Martin de los Andes to Naxos?
The estimated flight time from Aviador Carlos Campos Airport to Naxos Island National Airport is 15 hours and 49 minutes.
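The page does not explain how this estimate is derived. A common approach is simply distance divided by an assumed average block speed; the sketch below uses 511.6 mph, a hypothetical value back-calculated from the figures quoted here, not the site's documented method.

```python
def flight_time_hours(distance_miles, avg_speed_mph=511.6):
    """Crude block-time estimate: distance divided by an assumed average speed.
    The 511.6 mph default is back-calculated from this page's own numbers."""
    return distance_miles / avg_speed_mph

hours = flight_time_hours(8091.218)
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")   # ≈ 15 h 49 min with this assumed speed
```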
What is the time difference between San Martin de los Andes and Naxos?
San Martin de los Andes is on Argentina time (UTC-3), while Naxos is on Eastern European Time (UTC+2, or UTC+3 during daylight saving time), so Naxos is 5 to 6 hours ahead depending on the season.
Flight carbon footprint between Aviador Carlos Campos Airport (CPC) and Naxos Island National Airport (JNX)
On average, flying from San Martin de los Andes to Naxos generates about 1,012 kg of CO2 per passenger, equivalent to about 2,231 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
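The emission factor behind this estimate is not stated. The sketch below assumes a hypothetical factor of about 0.125 kg of CO2 per passenger-mile, back-calculated from the quoted figures, together with the standard kilogram-to-pound conversion.

```python
KG_PER_LB = 0.45359237            # exact definition of the avoirdupois pound

def co2_estimate_kg(distance_miles, kg_per_passenger_mile=0.125):
    """CO2 per passenger from jet-fuel burn only. The 0.125 kg/mile factor is an
    assumption back-calculated from this page's figures, not a published factor."""
    return distance_miles * kg_per_passenger_mile

kg = co2_estimate_kg(8091.218)
print(round(kg), "kg =", round(kg / KG_PER_LB), "lb")
# ≈ 1011 kg ≈ 2230 lb with this assumed factor; the page quotes 1,012 kg / 2,231 lb
```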
Map of flight path from San Martin de los Andes to Naxos
See the map of the shortest flight path between Aviador Carlos Campos Airport (CPC) and Naxos Island National Airport (JNX).
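The shortest flight path follows the great circle between the two airports. Below is a small sketch that generates evenly spaced waypoints along that path via spherical linear interpolation; a plotting library could join these points into the curve shown on such a map. The decimal coordinates and the number of waypoints are assumptions, and the math assumes distinct, non-antipodal endpoints.

```python
import math

def to_unit_vector(lat, lon):
    """Latitude/longitude in degrees -> 3-D unit vector on a unit sphere."""
    phi, lam = math.radians(lat), math.radians(lon)
    return (math.cos(phi) * math.cos(lam), math.cos(phi) * math.sin(lam), math.sin(phi))

def great_circle_waypoints(lat1, lon1, lat2, lon2, n=10):
    """Return n+1 evenly spaced (lat, lon) points along the great circle
    from point 1 to point 2 (spherical linear interpolation)."""
    p = to_unit_vector(lat1, lon1)
    q = to_unit_vector(lat2, lon2)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(p, q))))
    theta = math.acos(dot)                       # central angle between the points
    points = []
    for i in range(n + 1):
        f = i / n
        wa = math.sin((1 - f) * theta) / math.sin(theta)
        wb = math.sin(f * theta) / math.sin(theta)
        x, y, z = (wa * pc + wb * qc for pc, qc in zip(p, q))
        points.append((math.degrees(math.asin(z)), math.degrees(math.atan2(y, x))))
    return points

# Waypoints between CPC and JNX
for lat, lon in great_circle_waypoints(-40.075278, -71.137222, 37.080833, 25.368056):
    print(f"{lat:8.3f}, {lon:9.3f}")
```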
Airport information
| Origin | Aviador Carlos Campos Airport |
| --- | --- |
| City: | San Martin de los Andes |
| Country: | Argentina |
| IATA Code: | CPC |
| ICAO Code: | SAZY |
| Coordinates: | 40°4′31″S, 71°8′14″W |
| Destination | Naxos Island National Airport |
| --- | --- |
| City: | Naxos |
| Country: | Greece |
| IATA Code: | JNX |
| ICAO Code: | LGNX |
| Coordinates: | 37°4′51″N, 25°22′5″E |
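The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on this page work with decimal degrees. A small conversion sketch, using the hemisphere letter to set the sign:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# CPC: 40°4′31″S, 71°8′14″W  ->  (-40.0753, -71.1372)
# JNX: 37°4′51″N, 25°22′5″E  ->  ( 37.0808,  25.3681)
print(dms_to_decimal(40, 4, 31, "S"), dms_to_decimal(71, 8, 14, "W"))
print(dms_to_decimal(37, 4, 51, "N"), dms_to_decimal(25, 22, 5, "E"))
```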