
How far is Naxos from Santa Cruz De La Palma?

The distance between Santa Cruz De La Palma (La Palma Airport) and Naxos (Naxos Island National Airport) is 2552 miles / 4108 kilometers / 2218 nautical miles.

The driving distance from Santa Cruz De La Palma (SPC) to Naxos (JNX) is 3683 miles / 5928 kilometers, and travel time by car is about 85 hours 20 minutes.

La Palma Airport – Naxos Island National Airport

2552 miles / 4108 kilometers / 2218 nautical miles


Distance from Santa Cruz De La Palma to Naxos

There are several ways to calculate the distance from Santa Cruz De La Palma to Naxos. Here are two standard methods:

Vincenty's formula (applied above)
  • 2552.344 miles
  • 4107.600 kilometers
  • 2217.926 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
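
As an illustration only (not the calculator's own code), below is a minimal Python sketch of the inverse Vincenty method on the WGS-84 ellipsoid. The coordinates come from the airport information section, and the result should land close to the 2552-mile figure above.

  import math

  def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
      """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles.

      A minimal sketch: coordinates are decimal degrees, and near-antipodal
      point pairs (which may not converge) are not handled.
      """
      a = 6378137.0               # WGS-84 semi-major axis (m)
      f = 1 / 298.257223563       # WGS-84 flattening
      b = (1 - f) * a             # semi-minor axis (m)

      L = math.radians(lon2 - lon1)
      U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
      U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
      sinU1, cosU1 = math.sin(U1), math.cos(U1)
      sinU2, cosU2 = math.sin(U2), math.cos(U2)

      lam = L
      for _ in range(max_iter):
          sin_lam, cos_lam = math.sin(lam), math.cos(lam)
          sin_sigma = math.hypot(cosU2 * sin_lam,
                                 cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
          if sin_sigma == 0:
              return 0.0          # coincident points
          cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
          sigma = math.atan2(sin_sigma, cos_sigma)
          sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
          cos_sq_alpha = 1 - sin_alpha ** 2
          cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                          if cos_sq_alpha else 0.0)   # equatorial geodesic
          C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
          lam_prev = lam
          lam = L + (1 - C) * f * sin_alpha * (
              sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                       (-1 + 2 * cos_2sigma_m ** 2)))
          if abs(lam - lam_prev) < tol:
              break

      u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
      A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
      B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
      delta_sigma = B * sin_sigma * (
          cos_2sigma_m + B / 4 * (cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
                                  B / 6 * cos_2sigma_m *
                                  (-3 + 4 * sin_sigma ** 2) *
                                  (-3 + 4 * cos_2sigma_m ** 2)))
      meters = b * A * (sigma - delta_sigma)
      return meters / 1609.344    # metres -> statute miles

  # SPC (28°37′35″N, 17°45′20″W) to JNX (37°4′51″N, 25°22′5″E)
  print(round(vincenty_miles(28.6264, -17.7556, 37.0808, 25.3681), 1))  # ≈ 2552 miles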

Haversine formula
  • 2547.617 miles
  • 4099.992 kilometers
  • 2213.819 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
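
For comparison, here is a minimal haversine sketch in Python. The exact mileage depends on the Earth radius assumed; a mean radius of 6371 km gives a figure close to the one above.

  import math

  def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
      """Great-circle distance on a sphere of mean radius radius_km, in miles."""
      phi1, phi2 = math.radians(lat1), math.radians(lat2)
      dphi = math.radians(lat2 - lat1)
      dlam = math.radians(lon2 - lon1)
      a = (math.sin(dphi / 2) ** 2
           + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
      return 2 * radius_km * math.asin(math.sqrt(a)) / 1.609344

  # Same SPC and JNX coordinates as above
  print(round(haversine_miles(28.6264, -17.7556, 37.0808, 25.3681), 1))  # ≈ 2548 miles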

How long does it take to fly from Santa Cruz De La Palma to Naxos?

The estimated flight time from La Palma Airport to Naxos Island National Airport is 5 hours and 19 minutes.
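
The timing formula behind this estimate is not published. As a rough check, dividing the Vincenty distance by an assumed average block speed of about 480 mph (an assumption, not a documented parameter) reproduces a similar figure:

  distance_miles = 2552.344          # Vincenty distance from above
  assumed_speed_mph = 480            # assumed average block speed, not a published figure
  hours = distance_miles / assumed_speed_mph
  h = int(hours)
  m = round((hours - h) * 60)
  print(f"{h} h {m} min")            # 5 h 19 min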

Flight carbon footprint between La Palma Airport (SPC) and Naxos Island National Airport (JNX)

On average, flying from Santa Cruz De La Palma to Naxos generates about 281 kg (roughly 620 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
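
Taken together with the distance above, the estimate works out to roughly 0.11 kg of CO2 per passenger-mile on this route; the quick arithmetic below uses only the figures already quoted:

  co2_kg = 281                        # estimated CO2 per passenger (rounded)
  print(round(co2_kg * 2.20462))      # kg -> lb, ≈ 620 lb (619 from the rounded input)
  print(round(co2_kg / 2552.344, 2))  # ≈ 0.11 kg CO2 per passenger-mile on this route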

Map of flight path and driving directions from Santa Cruz De La Palma to Naxos

See the map of the shortest flight path between La Palma Airport (SPC) and Naxos Island National Airport (JNX).

Airport information

Origin: La Palma Airport
City: Santa Cruz De La Palma
Country: Spain
IATA Code: SPC
ICAO Code: GCLA
Coordinates: 28°37′35″N, 17°45′20″W

Destination: Naxos Island National Airport
City: Naxos
Country: Greece
IATA Code: JNX
ICAO Code: LGNX
Coordinates: 37°4′51″N, 25°22′5″E