
How far is Astypalaia Island from Surgut?

The distance between Surgut (Surgut International Airport) and Astypalaia Island (Astypalaia Island National Airport) is 2647 miles / 4259 kilometers / 2300 nautical miles.

Surgut International Airport – Astypalaia Island National Airport

  • 2647 miles
  • 4259 kilometers
  • 2300 nautical miles


Distance from Surgut to Astypalaia Island

There are several ways to calculate the distance from Surgut to Astypalaia Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 2646.608 miles
  • 4259.303 kilometers
  • 2299.840 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
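A minimal sketch of Vincenty's inverse method is below, assuming the WGS-84 ellipsoid (the page does not state which datum it uses, so the ellipsoid parameters here are an assumption):

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in statute miles via Vincenty's inverse formula (WGS-84 assumed)."""
    a = 6378137.0               # semi-major axis in meters
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); guard against equatorial lines where cos2_alpha == 0
        cos_2sm = 0.0 if cos2_alpha == 0 else cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344    # meters -> statute miles
```

Because the ellipsoid model accounts for the earth's flattening, this should land within a fraction of a mile of the 2646.608-mile figure above.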

Haversine formula
  • 2641.949 miles
  • 4251.805 kilometers
  • 2295.791 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
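The haversine calculation is short enough to sketch in full, using the airport coordinates listed below (converted from degrees/minutes/seconds) and an assumed mean earth radius of 3958.8 miles:

```python
import math

def dms(deg, minutes, seconds):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return deg + minutes / 60 + seconds / 3600

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles, assuming a spherical earth."""
    R = 3958.8  # mean earth radius in miles (assumed value)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# SGC (61°20′37″N, 73°24′6″E) to JTY (36°34′47″N, 26°22′32″E)
distance = haversine_miles(dms(61, 20, 37), dms(73, 24, 6),
                           dms(36, 34, 47), dms(26, 22, 32))
```

The spherical result is a few miles shorter than the ellipsoidal Vincenty figure, matching the gap between the two sets of numbers above.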

How long does it take to fly from Surgut to Astypalaia Island?

The estimated flight time from Surgut International Airport to Astypalaia Island National Airport is 5 hours and 30 minutes.
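The page does not state how it derives flight time, but estimates like this typically use the distance at an assumed average cruise speed plus a fixed overhead for taxi, climb, and descent. A sketch under those assumptions (both numbers are illustrative, not the site's actual parameters):

```python
def flight_time_hours(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough flight-time estimate: cruise time plus fixed taxi/climb/descent overhead.
    cruise_mph and overhead_hours are assumed values, not the site's model."""
    return overhead_hours + distance_miles / cruise_mph

t = flight_time_hours(2646.608)  # ≈ 5.79 hours with these assumptions
```

Different assumed speeds and overheads shift the answer by tens of minutes, which is why such figures should be read as ballpark estimates in line with the quoted 5 hours 30 minutes.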

Flight carbon footprint between Surgut International Airport (SGC) and Astypalaia Island National Airport (JTY)

On average, flying from Surgut to Astypalaia Island generates about 292 kg of CO2 per passenger, roughly 644 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
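The kilogram-to-pound conversion is a fixed factor (1 lb = 0.45359237 kg exactly); a published figure may differ by a pound if it was converted from the unrounded kilogram estimate:

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

lbs = kg_to_lb(292)  # ≈ 643.75, i.e. about 644 lb
```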

Map of flight path from Surgut to Astypalaia Island

See the map of the shortest flight path between Surgut International Airport (SGC) and Astypalaia Island National Airport (JTY).

Airport information

Origin: Surgut International Airport
City: Surgut
Country: Russia
IATA Code: SGC
ICAO Code: USRR
Coordinates: 61°20′37″N, 73°24′6″E

Destination: Astypalaia Island National Airport
City: Astypalaia Island
Country: Greece
IATA Code: JTY
ICAO Code: LGPL
Coordinates: 36°34′47″N, 26°22′32″E