How far is Naxos from San Jose?
The distance between San Jose (Juan Santamaría International Airport) and Naxos (Naxos Island National Airport) is 6857 miles / 11035 kilometers / 5959 nautical miles.
Juan Santamaría International Airport – Naxos Island National Airport
Distance from San Jose to Naxos
There are several ways to calculate the distance from San Jose to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 6857.135 miles
- 11035.489 kilometers
- 5958.687 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
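For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates come from the airport tables below, and the helper name `vincenty_miles` is ours, not the site's:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha == 0 only for points on the equator
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344   # meters to miles

# SJO (9°59′37″N, 84°12′31″W) to JNX (37°4′51″N, 25°22′5″E)
print(vincenty_miles(9.9936, -84.2086, 37.0808, 25.3681))  # ≈ 6857.1 miles
```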
Haversine formula
- 6849.017 miles
- 11022.424 kilometers
- 5951.633 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
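A spherical-earth sketch in Python for comparison, assuming a mean radius of 3958.8 miles (6371 km); the helper name is ours:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance assuming a spherical earth of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_mi * math.asin(math.sqrt(h))

print(haversine_miles(9.9936, -84.2086, 37.0808, 25.3681))  # ≈ 6849 miles
```

The two results differ by about 8 miles because the haversine formula treats the earth as a perfect sphere, while Vincenty's formula accounts for its equatorial bulge.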
How long does it take to fly from San Jose to Naxos?
The estimated flight time from Juan Santamaría International Airport to Naxos Island National Airport is 13 hours and 28 minutes.
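Dividing the Vincenty distance by this estimate implies an average block speed of roughly 509 mph, a plausible figure for a commercial jet. A back-of-the-envelope check (the speed is our inference, not a stated parameter of the estimate):

```python
distance_mi = 6857.135      # Vincenty distance from above
avg_speed_mph = 509         # assumed effective average speed (our inference)
hours = distance_mi / avg_speed_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # 13 h 28 min
```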
What is the time difference between San Jose and Naxos?
The time difference between San Jose and Naxos is 8 hours: Naxos (UTC+2) is 8 hours ahead of San Jose (UTC-6). Costa Rica does not observe daylight saving time, so the gap widens to 9 hours while Greece is on summer time.
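This can be verified with Python's standard zoneinfo database:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Costa Rica stays on UTC-6 all year; Greece is UTC+2 (UTC+3 in summer).
winter = datetime(2024, 1, 15, 12, 0)
offset = (winter.replace(tzinfo=ZoneInfo("Europe/Athens")).utcoffset()
          - winter.replace(tzinfo=ZoneInfo("America/Costa_Rica")).utcoffset())
print(offset)  # 8:00:00
```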
Flight carbon footprint between Juan Santamaría International Airport (SJO) and Naxos Island National Airport (JNX)
On average, flying from San Jose to Naxos generates about 835 kg of CO2 per passenger; 835 kilograms equals about 1,841 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
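The unit conversion, plus a derived per-mile figure (our arithmetic on the numbers above, not a stated figure):

```python
co2_kg = 835                                   # per-passenger estimate from above
print(f"{co2_kg * 2.20462:.0f} lb")            # ≈ 1841 lb
print(f"{co2_kg / 6857.135:.3f} kg CO2/mile")  # ≈ 0.122 kg per passenger-mile
```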
Map of flight path from San Jose to Naxos
See the map of the shortest flight path between Juan Santamaría International Airport (SJO) and Naxos Island National Airport (JNX).
Airport information
| Origin | Juan Santamaría International Airport |
| --- | --- |
| City | San Jose |
| Country | Costa Rica |
| IATA Code | SJO |
| ICAO Code | MROC |
| Coordinates | 9°59′37″N, 84°12′31″W |
| Destination | Naxos Island National Airport |
| --- | --- |
| City | Naxos |
| Country | Greece |
| IATA Code | JNX |
| ICAO Code | LGNX |
| Coordinates | 37°4′51″N, 25°22′5″E |
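The coordinates above are given in degrees, minutes, and seconds; a small helper (name ours) converts them to the signed decimal degrees used by the distance formulas earlier:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(9, 59, 37, "N"), dms_to_decimal(84, 12, 31, "W"))  # ≈ 9.9936, -84.2086
print(dms_to_decimal(37, 4, 51, "N"), dms_to_decimal(25, 22, 5, "E"))   # ≈ 37.0808, 25.3681
```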