
How far is Naxos from Lamezia Terme?

The distance between Lamezia Terme (Lamezia Terme International Airport) and Naxos (Naxos Island National Airport) is 513 miles / 826 kilometers / 446 nautical miles.

The driving distance from Lamezia Terme (SUF) to Naxos (JNX) is 854 miles / 1374 kilometers, and travel time by car is about 37 hours 16 minutes.

Lamezia Terme International Airport – Naxos Island National Airport

513 miles / 826 kilometers / 446 nautical miles


Distance from Lamezia Terme to Naxos

There are several ways to calculate the distance from Lamezia Terme to Naxos. Here are two standard methods:

Vincenty's formula (applied above)
  • 513.495 miles
  • 826.389 kilometers
  • 446.215 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
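As a sketch of how that ellipsoidal calculation works, here is a self-contained implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the two airports' coordinates (converted to decimal degrees). This is a standard textbook implementation, not necessarily the exact code used by this site.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = a * (1 - f)            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# SUF (38°54′19″N, 16°14′32″E) and JNX (37°4′51″N, 25°22′5″E) in decimal degrees
km = vincenty_km(38.905278, 16.242222, 37.080833, 25.368056)
miles = km / 1.609344
nm = km / 1.852
```

Running this reproduces the figures above to within rounding: roughly 826 km, 513 miles, 446 nautical miles.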

Haversine formula
  • 512.399 miles
  • 824.626 kilometers
  • 445.262 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
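The spherical calculation is much shorter. A minimal haversine implementation, assuming the commonly used mean earth radius of 6,371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical earth (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same airport coordinates in decimal degrees
km = haversine_km(38.905278, 16.242222, 37.080833, 25.368056)
```

The result, about 825 km, differs from the Vincenty figure by under 2 km, which is why a spherical model is often good enough for flight-distance estimates.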

How long does it take to fly from Lamezia Terme to Naxos?

The estimated flight time from Lamezia Terme International Airport to Naxos Island National Airport is 1 hour and 28 minutes.
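The site does not publish its flight-time model, but a common rule of thumb is cruise time plus a fixed allowance for taxi, climb, and descent. The cruise speed and overhead below are assumptions for illustration, not this site's parameters:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough airline flight-time estimate: time at an assumed cruise speed
    plus a fixed taxi/climb/descent allowance (both values are assumptions)."""
    return distance_miles / cruise_mph * 60.0 + overhead_min

minutes = estimated_flight_minutes(513.495)
```

With these assumed values the estimate comes out in the low 90s of minutes, in the same ballpark as the 1 hour 28 minutes quoted above.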

Flight carbon footprint between Lamezia Terme International Airport (SUF) and Naxos Island National Airport (JNX)

On average, flying from Lamezia Terme to Naxos generates about 101 kg (222 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
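The unit conversion and the implied per-kilometer emission factor can be checked directly. The factor derived below is simply the page's figures divided out, not an official emission coefficient:

```python
KG_PER_LB = 0.45359237          # exact kg-per-pound conversion

co2_kg = 101.0                  # per-passenger estimate quoted above
distance_km = 826.389           # Vincenty distance quoted above

co2_lb = co2_kg / KG_PER_LB     # about 222.7 lb
kg_per_km = co2_kg / distance_km  # implied factor, about 0.122 kg CO2 per km
```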

Map of flight path and driving directions from Lamezia Terme to Naxos

See the map of the shortest flight path between Lamezia Terme International Airport (SUF) and Naxos Island National Airport (JNX).

Airport information

Origin Lamezia Terme International Airport
City: Lamezia Terme
Country: Italy
IATA Code: SUF
ICAO Code: LICA
Coordinates: 38°54′19″N, 16°14′32″E
Destination Naxos Island National Airport
City: Naxos
Country: Greece
IATA Code: JNX
ICAO Code: LGNX
Coordinates: 37°4′51″N, 25°22′5″E