How far is Naxos from Tortolì?
The distance between Tortolì (Tortolì Airport) and Naxos (Naxos Island National Airport) is 871 miles / 1402 kilometers / 757 nautical miles.
The driving distance from Tortolì (TTB) to Naxos (JNX) is 1270 miles / 2044 kilometers, and travel time by car is about 50 hours 23 minutes.
Tortolì Airport – Naxos Island National Airport
Distance from Tortolì to Naxos
There are several ways to calculate the distance from Tortolì to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 871.183 miles
- 1402.033 kilometers
- 757.037 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
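Since Vincenty's formula is the method applied above, here is a minimal, self-contained Python sketch of the standard inverse formulation on the WGS-84 ellipsoid. The function name, the iteration cap, and the 1e-12 convergence tolerance are illustrative choices, not anything this page specifies:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma * (
            cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# Airport coordinates from the tables below, converted from DMS to decimal degrees.
ttb = (39 + 55/60 + 7/3600, 9 + 40/60 + 58/3600)   # Tortolì Airport (TTB)
jnx = (37 + 4/60 + 51/3600, 25 + 22/60 + 5/3600)   # Naxos Island National Airport (JNX)
metres = vincenty_distance(*ttb, *jnx)
print(f"{metres / 1609.344:.3f} miles")  # ≈ 871 miles, matching the figure above
```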
Haversine formula
- 869.261 miles
- 1398.941 kilometers
- 755.367 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
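For comparison, a haversine sketch for the same coordinates, assuming the commonly used 6,371 km mean Earth radius (the exact radius behind the figures above isn't stated):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# TTB and JNX coordinates in decimal degrees
km = haversine_distance(39.9186, 9.6828, 37.0808, 25.3681)
print(f"{km:.1f} km")  # ≈ 1399 km, matching the figure above
```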
How long does it take to fly from Tortolì to Naxos?
The estimated flight time from Tortolì Airport to Naxos Island National Airport is 2 hours and 8 minutes.
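The page doesn't state how this estimate is derived; a common back-of-the-envelope model is cruise distance over an assumed average speed, plus a fixed allowance for taxi, climb, and descent. A sketch with assumed values of 500 mph and 30 minutes lands in the same ballpark as the figure above:

```python
# Rough estimate only: 500 mph average speed and a 30-minute fixed allowance
# are assumed values, not the exact model behind the 2 h 8 min figure.
distance_mi = 871.183
minutes = 30 + distance_mi / 500 * 60
print(f"{int(minutes // 60)} h {int(minutes % 60)} min")  # ≈ 2 h 14 min
```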
What is the time difference between Tortolì and Naxos?
The time difference between Tortolì and Naxos is 1 hour: Naxos is 1 hour ahead of Tortolì.
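To verify the offset programmatically: Tortolì follows Europe/Rome and Naxos follows Europe/Athens, and both observe EU daylight saving, so the one-hour gap holds year-round:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Noon in Tortolì (Europe/Rome) is 13:00 on Naxos (Europe/Athens).
t = datetime(2024, 6, 1, 12, 0, tzinfo=ZoneInfo("Europe/Rome"))
print(t.astimezone(ZoneInfo("Europe/Athens")).strftime("%H:%M"))  # 13:00
```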
Flight carbon footprint between Tortolì Airport (TTB) and Naxos Island National Airport (JNX)
On average, flying from Tortolì to Naxos generates about 141 kg (311 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
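As a quick check on the unit conversion, and to express the figure as a per-kilometre intensity (a derived number, not one stated on this page):

```python
co2_kg = 141                                  # estimated CO2 per passenger
print(f"{co2_kg * 2.20462:.0f} lb")           # 311 lb
print(f"{co2_kg / 1402.033:.3f} kg per km")   # ≈ 0.101 kg CO2 per passenger-km
```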
Map of flight path and driving directions from Tortolì to Naxos
[Map: shortest flight path between Tortolì Airport (TTB) and Naxos Island National Airport (JNX), with driving directions.]
Airport information
| Origin | Tortolì Airport |
|---|---|
| City | Tortolì |
| Country | Italy |
| IATA Code | TTB |
| ICAO Code | LIET |
| Coordinates | 39°55′7″N, 9°40′58″E |
| Destination | Naxos Island National Airport |
|---|---|
| City | Naxos |
| Country | Greece |
| IATA Code | JNX |
| ICAO Code | LGNX |
| Coordinates | 37°4′51″N, 25°22′5″E |