
How far is Sørkjosen from Astypalaia Island?

The distance between Astypalaia Island (Astypalaia Island National Airport) and Sørkjosen (Sørkjosen Airport) is 2305 miles / 3710 kilometers / 2003 nautical miles.

The driving distance from Astypalaia Island (JTY) to Sørkjosen (SOJ) is 3451 miles / 5554 kilometers, and travel time by car is about 86 hours 46 minutes.

Astypalaia Island National Airport – Sørkjosen Airport


Distance from Astypalaia Island to Sørkjosen

There are several ways to calculate the distance from Astypalaia Island to Sørkjosen. Here are two standard methods:

Vincenty's formula (applied above)
  • 2305.144 miles
  • 3709.770 kilometers
  • 2003.116 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
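As a sketch of how such a figure can be computed, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, fed with the two airports' coordinates from the table below (converted by us to decimal degrees). This is an illustrative implementation, not necessarily the exact code behind the page, but it reproduces the quoted 3709.77 km to within rounding of the listed coordinates.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Geodesic distance in km on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm
                + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                          * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# Airport coordinates from the table below, in decimal degrees
jty = (36 + 34/60 + 47/3600, 26 + 22/60 + 32/3600)   # JTY
soj = (69 + 47/60 + 12/3600, 20 + 57/60 + 33/3600)   # SOJ
geodesic_km = vincenty_km(*jty, *soj)
```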

Haversine formula
  • 2303.318 miles
  • 3706.830 kilometers
  • 2001.528 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
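The haversine formula above is short enough to sketch directly. The mean Earth radius R = 6371 km is an assumption on our part (the page does not state which radius it uses), which is why the result lands within about a kilometre of the quoted 3706.83 km rather than matching it exactly.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, R=6371.0):
    """Great-circle distance on a sphere of mean radius R (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Airport coordinates from the table below, in decimal degrees
jty = (36 + 34/60 + 47/3600, 26 + 22/60 + 32/3600)   # JTY
soj = (69 + 47/60 + 12/3600, 20 + 57/60 + 33/3600)   # SOJ
sphere_km = haversine_km(*jty, *soj)
```

Because the spherical model ignores the Earth's flattening, the haversine result comes out about 3 km (roughly 0.08%) shorter than the Vincenty figure on this route.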

How long does it take to fly from Astypalaia Island to Sørkjosen?

The estimated flight time from Astypalaia Island National Airport to Sørkjosen Airport is 4 hours and 51 minutes.
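The page does not publish the formula behind this estimate, but from its own figures we can recover the implied average (block) speed:

```python
# Implied average speed from the page's own distance and flight-time figures
distance_miles = 2305            # great-circle distance quoted above
flight_time_h = 4 + 51 / 60      # 4 hours 51 minutes
avg_speed_mph = distance_miles / flight_time_h   # roughly 475 mph
```

An implied block speed of about 475 mph is plausible for a narrow-body jet once taxi, climb, and descent are averaged in.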

Flight carbon footprint between Astypalaia Island National Airport (JTY) and Sørkjosen Airport (SOJ)

On average, flying from Astypalaia Island to Sørkjosen generates about 253 kg of CO2 per passenger, equivalent to roughly 557 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Astypalaia Island to Sørkjosen

See the map of the shortest flight path between Astypalaia Island National Airport (JTY) and Sørkjosen Airport (SOJ).

Airport information

Origin Astypalaia Island National Airport
City: Astypalaia Island
Country: Greece
IATA Code: JTY
ICAO Code: LGPL
Coordinates: 36°34′47″N, 26°22′32″E
Destination Sørkjosen Airport
City: Sørkjosen
Country: Norway
IATA Code: SOJ
ICAO Code: ENSR
Coordinates: 69°47′12″N, 20°57′33″E