
How far is Astypalaia Island from Ust-Kuyga?

The distance between Ust-Kuyga (Ust-Kuyga Airport) and Astypalaia Island (Astypalaia Island National Airport) is 4294 miles / 6911 kilometers / 3732 nautical miles.

Ust-Kuyga Airport – Astypalaia Island National Airport: 4294 miles / 6911 kilometers / 3732 nautical miles


Distance from Ust-Kuyga to Astypalaia Island

There are several ways to calculate the distance from Ust-Kuyga to Astypalaia Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 4294.262 miles
  • 6910.945 kilometers
  • 3731.612 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
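For readers who want to reproduce the figure, the sketch below is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates are the decimal-degree equivalents of the DMS values listed in the airport information section; the iteration tolerance and ellipsoid constants are assumptions on my part, so the result may differ slightly from the quoted 6910.945 km.

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2):
    """Distance in kilometers on the WGS-84 ellipsoid (Vincenty inverse formula)."""
    a = 6378137.0               # equatorial radius, meters
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # polar radius

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # meters -> kilometers

# UKG and JTY positions, converted from the DMS coordinates listed below
print(vincenty_distance_km(70.0108, 135.6450, 36.5797, 26.3756))   # roughly 6911 km
```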

Haversine formula
  • 4283.873 miles
  • 6894.225 kilometers
  • 3722.584 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
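A haversine version is much shorter. The sketch below assumes a mean Earth radius of 6371 km, which is why it gives the slightly smaller figure of roughly 6894 km; the coordinates are again my decimal conversions of the DMS values listed further down.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

print(haversine_km(70.0108, 135.6450, 36.5797, 26.3756))   # roughly 6894 km
```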

How long does it take to fly from Ust-Kuyga to Astypalaia Island?

The estimated flight time from Ust-Kuyga Airport to Astypalaia Island National Airport is 8 hours and 37 minutes.
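The page does not state how this figure is derived. A common rule of thumb, used here purely as an assumption and not as the site's documented method, is to divide the distance by a typical cruise speed of about 500 mph and add roughly half an hour for taxi, climb, and descent; with those assumed parameters the sketch below lands near nine hours rather than exactly 8 hours and 37 minutes.

```python
def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent buffer.
    Both parameters are illustrative assumptions, not the calculator's published method."""
    hours = distance_miles / cruise_mph + overhead_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours {m} minutes"

print(estimated_flight_time(4294))   # about 9 hours 5 minutes with these assumptions
```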

Flight carbon footprint between Ust-Kuyga Airport (UKG) and Astypalaia Island National Airport (JTY)

On average, flying from Ust-Kuyga to Astypalaia Island generates about 493 kg of CO2 per passenger (roughly 1,088 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
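The calculator's emissions methodology is not published here. The sketch below simply applies an illustrative per-passenger emissions factor to the flight distance and converts kilograms to pounds; the 0.07 kg per passenger-kilometer factor is an assumption for demonstration, so it does not exactly reproduce the 493 kg figure above.

```python
KG_PER_LB = 0.45359237

def co2_per_passenger_kg(distance_km, kg_per_pax_km=0.07):
    """Very rough CO2 estimate; the emissions factor is an illustrative assumption."""
    return distance_km * kg_per_pax_km

kg = co2_per_passenger_kg(6911)
print(f"{kg:.0f} kg CO2 is about {kg / KG_PER_LB:.0f} lbs")   # ~484 kg, ~1066 lbs with this factor
```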

Map of flight path from Ust-Kuyga to Astypalaia Island

See the map of the shortest flight path between Ust-Kuyga Airport (UKG) and Astypalaia Island National Airport (JTY).

Airport information

Origin Ust-Kuyga Airport
City: Ust-Kuyga
Country: Russia
IATA Code: UKG
ICAO Code: UEBT
Coordinates: 70°0′39″N, 135°38′42″E
Destination Astypalaia Island National Airport
City: Astypalaia Island
Country: Greece
IATA Code: JTY
ICAO Code: LGPL
Coordinates: 36°34′47″N, 26°22′32″E
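The coordinates above are given in degrees, minutes, and seconds. A small helper like the one below (a convenience sketch, not part of the site) converts them to the decimal degrees used in the distance formulas earlier on this page.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Ust-Kuyga Airport (UKG): 70°0′39″N, 135°38′42″E
print(dms_to_decimal(70, 0, 39, "N"), dms_to_decimal(135, 38, 42, "E"))   # 70.0108, 135.645

# Astypalaia Island National Airport (JTY): 36°34′47″N, 26°22′32″E
print(dms_to_decimal(36, 34, 47, "N"), dms_to_decimal(26, 22, 32, "E"))   # 36.5797, 26.3756
```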