
How far is Astypalaia Island from Canberra?

The distance between Canberra (Canberra Airport) and Astypalaia Island (Astypalaia Island National Airport) is 9284 miles / 14941 kilometers / 8068 nautical miles.

Distance from Canberra to Astypalaia Island

There are several ways to calculate the distance from Canberra to Astypalaia Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 9284.187 miles
  • 14941.451 kilometers
  • 8067.738 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
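For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, convergence tolerance and iteration cap are illustrative choices rather than the calculator's own implementation, and the iteration can fail to converge for nearly antipodal points.

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Geodesic distance in kilometres via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0               # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha is 0 only for equatorial lines; the correction term vanishes there
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
                                     C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # metres -> kilometres

# CBR -> JTY in decimal degrees (south is negative); roughly 14,941 km per the figure above
print(vincenty_distance_km(-35.3067, 149.1950, 36.5797, 26.3756))
```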

Haversine formula
  • 9285.849 miles
  • 14944.126 kilometers
  • 8069.183 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
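A haversine implementation is much shorter. The sketch below assumes a mean earth radius of 6371 km, so its output will differ slightly from the figure above depending on the radius the calculator actually uses.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius (kilometres)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# CBR -> JTY: close to the 14,944 km quoted above, depending on the chosen radius
print(haversine_km(-35.3067, 149.1950, 36.5797, 26.3756))
```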

How long does it take to fly from Canberra to Astypalaia Island?

The estimated flight time from Canberra Airport to Astypalaia Island National Airport is 18 hours and 4 minutes.
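The page does not state the assumptions behind this estimate. A common back-of-envelope approach is to divide the great-circle distance by an assumed average speed and add a fixed allowance for taxi, climb and descent; the cruise speed and overhead below are illustrative assumptions, so the result will not necessarily match the 18 h 4 min figure.

```python
def estimate_flight_time(distance_miles, avg_speed_mph=500.0, overhead_hours=0.5):
    """Rough block-time estimate: cruise time at an assumed average speed
    plus a fixed overhead for taxi, climb and descent (both assumptions)."""
    hours = distance_miles / avg_speed_mph + overhead_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} h {m:02d} min"

# With these assumed parameters this prints 19 h 04 min; the page's own assumptions evidently differ
print(estimate_flight_time(9284))
```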

Flight carbon footprint between Canberra Airport (CBR) and Astypalaia Island National Airport (JTY)

On average, flying from Canberra to Astypalaia Island generates about 1,191 kg of CO2 per passenger, equivalent to roughly 2,625 pounds (lb). These figures are estimates and include only the CO2 generated by burning jet fuel.
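As a rough consistency check, the page's own numbers imply an emission factor of about 80 g of CO2 per passenger-kilometre for this route. The sketch below derives that figure and repeats the kilograms-to-pounds conversion; the variable names are illustrative.

```python
KG_PER_LB = 0.45359237            # exact definition of the avoirdupois pound

distance_km = 14941
co2_kg_per_passenger = 1191

# Implied emission factor, derived from the figures quoted above
factor_g_per_pkm = co2_kg_per_passenger * 1000 / distance_km
print(f"{factor_g_per_pkm:.0f} g CO2 per passenger-km")      # ~80

# Kilograms to pounds: about 2,626 lb, which the page rounds to 2,625
print(f"{co2_kg_per_passenger / KG_PER_LB:.0f} lb")
```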

Map of flight path from Canberra to Astypalaia Island

See the map of the shortest flight path between Canberra Airport (CBR) and Astypalaia Island National Airport (JTY).

Airport information

Origin: Canberra Airport
City: Canberra
Country: Australia
IATA Code: CBR
ICAO Code: YSCB
Coordinates: 35°18′24″S, 149°11′42″E

Destination: Astypalaia Island National Airport
City: Astypalaia Island
Country: Greece
IATA Code: JTY
ICAO Code: LGPL
Coordinates: 36°34′47″N, 26°22′32″E
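The coordinates above are given in degrees, minutes and seconds; to use them with either distance formula they first need to be converted to signed decimal degrees. A small sketch (the helper name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Canberra Airport (CBR): 35°18′24″S, 149°11′42″E
cbr = (dms_to_decimal(35, 18, 24, "S"), dms_to_decimal(149, 11, 42, "E"))
# Astypalaia Island National Airport (JTY): 36°34′47″N, 26°22′32″E
jty = (dms_to_decimal(36, 34, 47, "N"), dms_to_decimal(26, 22, 32, "E"))
print(cbr, jty)   # approx (-35.3067, 149.195) and (36.5797, 26.3756)
```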