How far is Astypalaia Island from Tucson, AZ?
The distance between Tucson (Tucson International Airport) and Astypalaia Island (Astypalaia Island National Airport) is 6961 miles / 11203 kilometers / 6049 nautical miles.
Tucson International Airport – Astypalaia Island National Airport
Distance from Tucson to Astypalaia Island
There are several ways to calculate the distance from Tucson to Astypalaia Island. Here are two standard methods:
Vincenty's formula (applied above)
- 6961.482 miles
- 11203.420 kilometers
- 6049.363 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
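The iteration can be sketched in Python. This is a minimal implementation of Vincenty's inverse solution on the WGS-84 ellipsoid; the constants and series terms follow the standard published algorithm, and the airport coordinates are converted from the table below (rounded to four decimal places):

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in km between two points via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = a * (1 - f)            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)   # equatorial lines: alpha = 90°
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # metres -> km

# TUS (32°6′57″N, 110°56′27″W) to JTY (36°34′47″N, 26°22′32″E)
print(round(vincenty_distance(32.1158, -110.9408, 36.5797, 26.3756), 1))
```

Because the model is ellipsoidal, the result differs slightly from the spherical haversine figure below.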
Haversine formula
- 6947.595 miles
- 11181.070 kilometers
- 6037.295 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
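The haversine formula is much simpler. A self-contained Python sketch, assuming a mean Earth radius of 6371 km (the exact radius the site uses is not stated):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# TUS to JTY, coordinates from the airport table below
print(round(haversine_distance(32.1158, -110.9408, 36.5797, 26.3756), 1))
```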
How long does it take to fly from Tucson to Astypalaia Island?
The estimated flight time from Tucson International Airport to Astypalaia Island National Airport is 13 hours and 40 minutes.
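The page does not state how it derives this estimate. One common rule of thumb divides the great-circle distance by an assumed average block speed; an average of about 510 mph (an assumption, not a figure from the source) roughly reproduces the time above:

```python
def estimated_flight_time(distance_miles, avg_speed_mph=510):
    """Rough gate-to-gate flight time; avg_speed_mph is an assumed block speed."""
    total_minutes = round(distance_miles / avg_speed_mph * 60)
    return divmod(total_minutes, 60)  # -> (hours, minutes)

hours, minutes = estimated_flight_time(6961.482)
print(f"{hours} hours and {minutes} minutes")
```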
What is the time difference between Tucson and Astypalaia Island?
Astypalaia Island is 9 hours ahead of Tucson in winter and 10 hours ahead in summer: Tucson stays on Mountain Standard Time (UTC−7) year-round because Arizona does not observe daylight saving time, while Greece uses Eastern European Time (UTC+2) and Eastern European Summer Time (UTC+3).
Flight carbon footprint between Tucson International Airport (TUS) and Astypalaia Island National Airport (JTY)
On average, flying from Tucson to Astypalaia Island generates about 849 kg of CO2 per passenger; 849 kilograms equals 1,872 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
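The pound figure is a straight unit conversion (1 kg ≈ 2.20462 lbs):

```python
def kg_to_lbs(kg):
    """Convert kilograms to pounds (1 kg = 2.20462 lbs)."""
    return kg * 2.20462

# Per-passenger CO2 estimate from above
print(round(kg_to_lbs(849)))  # → 1872
```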
Map of flight path from Tucson to Astypalaia Island
See the map of the shortest flight path between Tucson International Airport (TUS) and Astypalaia Island National Airport (JTY).
Airport information
| Origin | Tucson International Airport |
| --- | --- |
| City: | Tucson, AZ |
| Country: | United States |
| IATA Code: | TUS |
| ICAO Code: | KTUS |
| Coordinates: | 32°6′57″N, 110°56′27″W |
| Destination | Astypalaia Island National Airport |
| --- | --- |
| City: | Astypalaia Island |
| Country: | Greece |
| IATA Code: | JTY |
| ICAO Code: | LGPL |
| Coordinates: | 36°34′47″N, 26°22′32″E |
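The coordinates in the tables above are in degrees/minutes/seconds, while the distance formulas expect decimal degrees. A small helper (a sketch using Python's `re` module; the format string matches the notation used in the tables) performs the conversion:

```python
import re

def dms_to_decimal(dms):
    """Parse e.g. '32°6′57″N' into signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(round(dms_to_decimal("32°6′57″N"), 4))    # TUS latitude
print(round(dms_to_decimal("110°56′27″W"), 4))  # TUS longitude
```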