How far is Astypalaia Island from Sabetta?

The distance between Sabetta (Sabetta International Airport) and Astypalaia Island (Astypalaia Island National Airport) is 2896 miles / 4661 kilometers / 2517 nautical miles.

Sabetta International Airport – Astypalaia Island National Airport

  • 2896 miles
  • 4661 kilometers
  • 2517 nautical miles

Distance from Sabetta to Astypalaia Island

There are several ways to calculate the distance from Sabetta to Astypalaia Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 2896.412 miles
  • 4661.323 kilometers
  • 2516.913 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
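
To illustrate, here is a minimal Python sketch using the third-party geopy library, whose geodesic distance is likewise computed on the WGS-84 ellipsoid (geopy uses Karney's algorithm rather than Vincenty's, so the result may differ from the figure above by a few meters). The decimal coordinates are converted from the airport listing at the bottom of this page.

    from geopy.distance import geodesic

    # Decimal-degree coordinates from the airport information below
    SBT = (71.21917, 72.05194)   # Sabetta International Airport
    JTY = (36.57972, 26.37556)   # Astypalaia Island National Airport

    d = geodesic(SBT, JTY)       # ellipsoidal (WGS-84) distance
    print(f"{d.miles:.3f} miles / {d.kilometers:.3f} km / {d.nautical:.3f} NM")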

Haversine formula
  • 2891.801 miles
  • 4653.903 kilometers
  • 2512.907 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
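
The haversine formula is compact enough to implement directly. A minimal Python sketch, assuming a mean Earth radius of 3958.8 miles (the exact radius chosen shifts the result slightly):

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
        """Great-circle distance between two points on a sphere, in miles."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_mi * math.asin(math.sqrt(a))

    # Sabetta (71.21917N, 72.05194E) to Astypalaia Island (36.57972N, 26.37556E)
    print(haversine_miles(71.21917, 72.05194, 36.57972, 26.37556))  # ~2891.8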

How long does it take to fly from Sabetta to Astypalaia Island?

The estimated flight time from Sabetta International Airport to Astypalaia Island National Airport is 5 hours and 59 minutes.
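
The page does not state how this figure is derived. One common rule of thumb, shown below as a hypothetical reconstruction rather than the site's documented method, is cruise time at an assumed average speed of about 850 km/h plus an assumed 30 minutes for takeoff and landing; with the Vincenty distance above, it happens to reproduce 5 hours and 59 minutes.

    CRUISE_KMH = 850    # assumed average cruise speed
    OVERHEAD_MIN = 30   # assumed fixed allowance for takeoff and landing

    distance_km = 4661.323
    total_min = round(distance_km / CRUISE_KMH * 60 + OVERHEAD_MIN)
    print(f"{total_min // 60} hours and {total_min % 60} minutes")  # 5 hours and 59 minutes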

Flight carbon footprint between Sabetta International Airport (SBT) and Astypalaia Island National Airport (JTY)

On average, flying from Sabetta to Astypalaia Island generates about 322 kg of CO2 per passenger, which is equivalent to 710 pounds (lbs). These figures are estimates and account only for the CO2 generated by burning jet fuel.
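
As a quick check of the arithmetic (the 322 kg figure itself is taken from this page; the per-kilometer intensity is simply derived from it):

    co2_kg = 322                          # per-passenger estimate from this page
    co2_lbs = co2_kg * 2.20462            # kilograms to pounds
    g_per_pkm = co2_kg * 1000 / 4661.323  # implied grams of CO2 per passenger-km
    print(f"{co2_lbs:.0f} lbs, about {g_per_pkm:.0f} g CO2 per passenger-km")  # 710 lbs, about 69 g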

Map of flight path from Sabetta to Astypalaia Island

See the map of the shortest flight path between Sabetta International Airport (SBT) and Astypalaia Island National Airport (JTY).

Airport information

Origin: Sabetta International Airport
City: Sabetta
Country: Russia
IATA Code: SBT
ICAO Code: USDA
Coordinates: 71°13′9″N, 72°3′7″E
Destination: Astypalaia Island National Airport
City: Astypalaia Island
Country: Greece
IATA Code: JTY
ICAO Code: LGPL
Coordinates: 36°34′47″N, 26°22′32″E
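
The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier on this page expect decimal degrees. A minimal conversion sketch (the helper name is just for illustration):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds and a hemisphere letter to decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # Sabetta International Airport: 71°13′9″N, 72°3′7″E
    print(dms_to_decimal(71, 13, 9, "N"), dms_to_decimal(72, 3, 7, "E"))    # 71.2192, 72.0519
    # Astypalaia Island National Airport: 36°34′47″N, 26°22′32″E
    print(dms_to_decimal(36, 34, 47, "N"), dms_to_decimal(26, 22, 32, "E")) # 36.5797, 26.3756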