How far is Naxos from St John's?
The distance between St John's (V. C. Bird International Airport) and Naxos (Naxos Island National Airport) is 5367 miles / 8637 kilometers / 4663 nautical miles.
V. C. Bird International Airport – Naxos Island National Airport
Distance from St John's to Naxos
There are several ways to calculate the distance from St John's to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 5366.612 miles
- 8636.725 kilometers
- 4663.458 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
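For reference, the iterative inverse formula can be sketched in a few dozen lines. The page does not say which ellipsoid it uses; the sketch below assumes WGS-84, and the decimal coordinates are conversions of the degree/minute/second values in the airport tables further down.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):                      # iterate lambda to convergence
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# ANU (17°8′12″N, 61°47′33″W) to JNX (37°4′51″N, 25°22′5″E)
print(f"{vincenty_km(17.1367, -61.7925, 37.0808, 25.3681):.1f} km")
```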
Haversine formula
- 5358.868 miles
- 8624.262 kilometers
- 4656.729 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
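The haversine figure above can be reproduced with a short script. The mean Earth radius of 6371 km is a conventional value (the page does not state which radius it uses), and the decimal coordinates are conversions of the degree/minute/second values in the airport tables.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r * asin(sqrt(a))

# ANU (17°8′12″N, 61°47′33″W) to JNX (37°4′51″N, 25°22′5″E)
print(f"{haversine_km(17.1367, -61.7925, 37.0808, 25.3681):.1f} km")
```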
How long does it take to fly from St John's to Naxos?
The estimated flight time from V. C. Bird International Airport to Naxos Island National Airport is 10 hours and 39 minutes.
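The page does not state how it derives this estimate; a common approach is simply distance divided by an assumed average block speed. The sketch below uses a hypothetical 500 mph average, so its result only approximates the figure above (the page's effective speed is slightly higher).

```python
def flight_time(distance_mi, avg_mph=500.0):
    """Rough flight-time estimate: distance over an assumed average speed."""
    total_min = round(distance_mi / avg_mph * 60)
    return divmod(total_min, 60)   # (hours, minutes)

h, m = flight_time(5367)
print(f"{h} hours and {m} minutes")  # 10 hours and 44 minutes at 500 mph
```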
What is the time difference between St John's and Naxos?
The time difference between St John's and Naxos is 6 hours: Naxos is 6 hours ahead of St John's. When Greece observes daylight saving time (Antigua does not), the difference grows to 7 hours.
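The offset can be checked with Python's standard-library zoneinfo module (assuming tz data is available on the system); Antigua stays on Atlantic Standard Time year-round, while Greece switches between EET and EEST.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def hours_ahead(when):
    """How many hours Naxos (Europe/Athens) is ahead of St John's (America/Antigua)."""
    delta = (when.replace(tzinfo=ZoneInfo("Europe/Athens")).utcoffset()
             - when.replace(tzinfo=ZoneInfo("America/Antigua")).utcoffset())
    return delta.total_seconds() / 3600

print(hours_ahead(datetime(2024, 1, 15, 12, 0)))  # 6.0 (standard time)
print(hours_ahead(datetime(2024, 7, 15, 12, 0)))  # 7.0 (Greek summer time)
```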
Flight carbon footprint between V. C. Bird International Airport (ANU) and Naxos Island National Airport (JNX)
On average, flying from St John's to Naxos generates about 632 kg of CO2 per passenger; 632 kilograms equals 1,393 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
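The unit conversion above can be verified directly with the standard kilogram-to-pound factor:

```python
KG_TO_LB = 2.20462262  # pounds per kilogram

co2_kg = 632                        # per-passenger estimate from the page
co2_lb = round(co2_kg * KG_TO_LB)
print(co2_lb)  # 1393
```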
Map of flight path from St John's to Naxos
See the map of the shortest flight path between V. C. Bird International Airport (ANU) and Naxos Island National Airport (JNX).
Airport information
| Origin | V. C. Bird International Airport |
| --- | --- |
| City | St John's |
| Country | Antigua and Barbuda |
| IATA Code | ANU |
| ICAO Code | TAPA |
| Coordinates | 17°8′12″N, 61°47′33″W |
| Destination | Naxos Island National Airport |
| --- | --- |
| City | Naxos |
| Country | Greece |
| IATA Code | JNX |
| ICAO Code | LGNX |
| Coordinates | 37°4′51″N, 25°22′5″E |