How far is Astypalaia Island from Aqaba?
The distance between Aqaba (King Hussein International Airport) and Astypalaia Island (Astypalaia Island National Airport) is 694 miles / 1116 kilometers / 603 nautical miles.
King Hussein International Airport – Astypalaia Island National Airport
Distance from Aqaba to Astypalaia Island
There are several ways to calculate the distance from Aqaba to Astypalaia Island. Here are two standard methods:
Vincenty's formula (applied above)
- 693.591 miles
- 1116.226 kilometers
- 602.714 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
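For reference, here is a minimal Python sketch of an ellipsoidal distance calculation. It uses geopy's `geodesic`, which applies Karney's algorithm on the WGS-84 ellipsoid (a modern refinement of Vincenty's approach), so its result should land within a few meters of the Vincenty figure above. The decimal coordinates are converted from the DMS values in the airport tables below.

```python
from geopy.distance import geodesic

# Decimal-degree equivalents of the DMS coordinates in the airport tables.
AQJ = (29.611389, 35.018056)   # King Hussein International Airport
JTY = (36.579722, 26.375556)   # Astypalaia Island National Airport

d = geodesic(AQJ, JTY)  # WGS-84 ellipsoid, Karney's algorithm
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
# Should be very close to 693.591 mi / 1116.226 km / 602.714 NM
```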
Haversine formula
- 693.694 miles
- 1116.392 kilometers
- 602.803 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points along the surface of the sphere).
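A self-contained sketch of the haversine calculation, assuming the conventional mean earth radius of 6,371 km:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in km."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

d_km = haversine_km(29.611389, 35.018056, 36.579722, 26.375556)
print(f"{d_km:.3f} km")             # ~1116.39 km, matching the figure above
print(f"{d_km / 1.609344:.3f} mi")  # ~693.69 mi
print(f"{d_km / 1.852:.3f} NM")     # ~602.80 NM
```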
How long does it take to fly from Aqaba to Astypalaia Island?
The estimated flight time from King Hussein International Airport to Astypalaia Island National Airport is 1 hour and 48 minutes.
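The page does not state how this estimate is derived. A common rule of thumb for such calculators is the flight distance at an assumed average speed plus a fixed allowance for takeoff and landing; the 500 mph speed and 30-minute allowance below are illustrative assumptions, not the site's published model, so the result only roughly approximates the stated 1 hour 48 minutes.

```python
# Rough flight-time heuristic: distance / assumed average speed,
# plus a fixed allowance for climb, descent, and taxiing.
# Both parameters are assumptions for illustration only.
DISTANCE_MI = 693.591
AVG_SPEED_MPH = 500.0   # assumed average speed
OVERHEAD_MIN = 30.0     # assumed takeoff/landing allowance

total_min = DISTANCE_MI / AVG_SPEED_MPH * 60 + OVERHEAD_MIN
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # ~1 h 53 min
```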
What is the time difference between Aqaba and Astypalaia Island?
Aqaba (Jordan) is on UTC+3 year-round, while Astypalaia Island (Greece) observes UTC+2 in winter and UTC+3 during European summer time. Astypalaia Island is therefore 1 hour behind Aqaba in winter and on the same time in summer.
Flight carbon footprint between King Hussein International Airport (AQJ) and Astypalaia Island National Airport (JTY)
On average, flying from Aqaba to Astypalaia Island generates about 124 kg of CO2 per passenger, which is equivalent to 273 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
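As an illustration of how such a figure can be derived (the site's exact methodology is not given), a simple per-passenger estimate multiplies fuel burn per kilometer by the standard factor of ~3.16 kg of CO2 per kg of jet fuel burned, then divides by occupied seats. Every parameter below is an assumption for illustration only.

```python
# Illustrative per-passenger CO2 estimate. The fuel-burn rate, seat
# count, and load factor are all assumed values, not the site's model.
CO2_PER_KG_FUEL = 3.16      # kg CO2 released per kg of jet fuel burned
DISTANCE_KM = 1116.0
FUEL_BURN_KG_PER_KM = 2.0   # assumed average burn for a regional aircraft
SEATS = 70                  # assumed seat count
LOAD_FACTOR = 0.8           # assumed share of seats occupied

fuel_kg = DISTANCE_KM * FUEL_BURN_KG_PER_KM
co2_per_pax = fuel_kg * CO2_PER_KG_FUEL / (SEATS * LOAD_FACTOR)
print(f"~{co2_per_pax:.0f} kg CO2 per passenger")  # ~126 kg, same ballpark as the 124 kg above
```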
Map of flight path from Aqaba to Astypalaia Island
See the map of the shortest flight path between King Hussein International Airport (AQJ) and Astypalaia Island National Airport (JTY).
Airport information
| Origin | King Hussein International Airport |
| --- | --- |
| City: | Aqaba |
| Country: | Jordan |
| IATA Code: | AQJ |
| ICAO Code: | OJAQ |
| Coordinates: | 29°36′41″N, 35°1′5″E |
| Destination | Astypalaia Island National Airport |
| --- | --- |
| City: | Astypalaia Island |
| Country: | Greece |
| IATA Code: | JTY |
| ICAO Code: | LGPL |
| Coordinates: | 36°34′47″N, 26°22′32″E |
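The coordinates above are given in degrees-minutes-seconds. Here is a small sketch for converting them to the decimal degrees used in the distance calculations earlier; the parsing pattern is assumed from the notation in these tables.

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a DMS string like 29°36′41″N to decimal degrees."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(dms_to_decimal("29°36′41″N"), dms_to_decimal("35°1′5″E"))    # AQJ ≈ 29.6114, 35.0181
print(dms_to_decimal("36°34′47″N"), dms_to_decimal("26°22′32″E"))  # JTY ≈ 36.5797, 26.3756
```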