
How far is Naxos from Sandnessjoen?

The distance between Sandnessjoen (Sandnessjøen Airport, Stokka) and Naxos (Naxos Island National Airport) is 2063 miles / 3320 kilometers / 1792 nautical miles.

The driving distance from Sandnessjoen (SSJ) to Naxos (JNX) is 2881 miles / 4637 kilometers, and travel time by car is about 71 hours 22 minutes.

Sandnessjøen Airport, Stokka – Naxos Island National Airport

2063 miles / 3320 kilometers / 1792 nautical miles


Distance from Sandnessjoen to Naxos

There are several ways to calculate the distance from Sandnessjoen to Naxos. Here are two standard methods:

Vincenty's formula (applied above)
  • 2062.770 miles
  • 3319.706 kilometers
  • 1792.498 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
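The iteration behind that figure can be sketched in Python. This is a minimal implementation of Vincenty's inverse formula on the WGS-84 ellipsoid (it is not this site's actual code); the decimal coordinates are converted from the airport information below.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0                  # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis (meters)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate the longitude until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344       # meters to statute miles

# SSJ (65.9567 N, 12.4689 E) to JNX (37.0808 N, 25.3681 E)
print(round(vincenty_miles(65.9567, 12.4689, 37.0808, 25.3681), 1))
```

The result lands within a fraction of a mile of the 2062.770 miles quoted above.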

Haversine formula
  • 2061.361 miles
  • 3317.439 kilometers
  • 1791.274 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
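The haversine calculation is short enough to show in full. This sketch uses a mean Earth radius of 6371 km and the decimal coordinates converted from the airport information below:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of mean Earth radius 6371 km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# SSJ (65.9567 N, 12.4689 E) to JNX (37.0808 N, 25.3681 E)
km = haversine_km(65.9567, 12.4689, 37.0808, 25.3681)
print(round(km, 1), "km /", round(km / 1.609344, 1), "miles")
```

This reproduces the roughly 3317 km / 2061 mile figure above; the small gap versus Vincenty comes from treating the Earth as a sphere rather than an ellipsoid.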

How long does it take to fly from Sandnessjoen to Naxos?

The estimated flight time from Sandnessjøen Airport, Stokka to Naxos Island National Airport is 4 hours and 24 minutes.
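The page does not publish its exact assumptions. A common rule of thumb, sketched below with an assumed cruise speed of about 500 mph plus a fixed 30 minutes for taxi, climb, and descent, lands in the same ballpark as (though not exactly at) the 4 hours 24 minutes quoted above:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: a cruise leg plus fixed taxi/climb overhead.
    cruise_mph and overhead_min are assumptions, not the site's published inputs."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return hours, minutes

print(estimate_flight_time(2063))
```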

Flight carbon footprint between Sandnessjøen Airport, Stokka (SSJ) and Naxos Island National Airport (JNX)

On average, flying from Sandnessjoen to Naxos generates about 225 kg of CO2 per passenger, which is roughly 496 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
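The page's figures imply an emission factor of roughly 0.109 kg of CO2 per passenger-mile (225 kg over 2063 miles). The sketch below back-derives that factor from this route; it is an illustration of the arithmetic, not the site's published methodology:

```python
KG_PER_LB = 0.45359237  # exact kilogram/pound conversion factor

def co2_estimate_kg(distance_miles, kg_per_mile=225 / 2063):
    """Per-passenger CO2 estimate. The default emission factor (~0.109 kg
    per passenger-mile) is back-derived from this route's published figures."""
    return distance_miles * kg_per_mile

kg = co2_estimate_kg(2063)
print(round(kg), "kg, about", round(kg / KG_PER_LB), "lbs")
```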

Map of flight path and driving directions from Sandnessjoen to Naxos

See the map of the shortest flight path between Sandnessjøen Airport, Stokka (SSJ) and Naxos Island National Airport (JNX).

Airport information

Origin Sandnessjøen Airport, Stokka
City: Sandnessjoen
Country: Norway
IATA Code: SSJ
ICAO Code: ENST
Coordinates: 65°57′24″N, 12°28′8″E
Destination Naxos Island National Airport
City: Naxos
Country: Greece
IATA Code: JNX
ICAO Code: LGNX
Coordinates: 37°4′51″N, 25°22′5″E
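The coordinates above are given in degrees, minutes, and seconds; the distance formulas need decimal degrees. A small conversion helper (an illustrative sketch, not the site's code):

```python
import re

def dms_to_decimal(dms):
    """Convert a coordinate like 65°57′24″N to signed decimal degrees."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(round(dms_to_decimal("65°57′24″N"), 4))  # SSJ latitude
print(round(dms_to_decimal("25°22′5″E"), 4))   # JNX longitude
```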