How far is Astypalaia Island from Ulan-Ude?
The distance between Ulan-Ude (Baikal International Airport) and Astypalaia Island (Astypalaia Island National Airport) is 3944 miles / 6347 kilometers / 3427 nautical miles.
Baikal International Airport – Astypalaia Island National Airport
Distance from Ulan-Ude to Astypalaia Island
There are several ways to calculate the distance from Ulan-Ude to Astypalaia Island. Here are two standard methods:
Vincenty's formula (applied above)
- 3943.907 miles
- 6347.104 kilometers
- 3427.162 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
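A minimal sketch of Vincenty's inverse method on the WGS-84 ellipsoid illustrates how the figure above is computed. The airport coordinates are converted to decimal degrees from the tables below; the convergence threshold and iteration cap are implementation choices, not values from this article:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty's inverse formula)."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# UUD -> JTY, coordinates from the airport tables below (decimal degrees)
meters = vincenty_inverse(51.8078, 107.4378, 36.5797, 26.3756)
print(f"{meters / 1609.344:.3f} mi, {meters / 1000:.3f} km, {meters / 1852:.3f} NM")
```

Note that this simple sketch does not handle the special case of nearly antipodal points, where the iteration can fail to converge.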
Haversine formula
- 3934.187 miles
- 6331.460 kilometers
- 3418.715 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
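The haversine calculation is much shorter. A sketch using a mean earth radius of 6371 km (a common convention, though other radii are sometimes used):

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# UUD -> JTY, coordinates from the airport tables below (decimal degrees)
km = haversine(51.8078, 107.4378, 36.5797, 26.3756)
print(f"{km / 1.609344:.3f} mi, {km:.3f} km, {km / 1.852:.3f} NM")
```

The spherical result comes out roughly 16 km shorter than the ellipsoidal Vincenty figure for this route, which is typical of the 0.3% or so that the two models can disagree by.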
How long does it take to fly from Ulan-Ude to Astypalaia Island?
The estimated flight time from Baikal International Airport to Astypalaia Island National Airport is 7 hours and 58 minutes.
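A figure like this can be approximated by dividing the distance by an average block speed. The ~495 mph speed below is an assumption back-calculated from the article's own numbers, not an official airline value:

```python
# Rough flight-time estimate from great-circle distance.
# avg_speed_mph is an assumed average gate-to-gate speed, chosen
# to match the article's figure; it is not an airline-published number.
distance_miles = 3944
avg_speed_mph = 495

hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"about {h} h {m} min")
```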
What is the time difference between Ulan-Ude and Astypalaia Island?
Ulan-Ude is 6 hours ahead of Astypalaia Island: Ulan-Ude observes UTC+8 year-round, while Greece uses UTC+2 (UTC+3 during daylight saving time, when the difference narrows to 5 hours).
Flight carbon footprint between Baikal International Airport (UUD) and Astypalaia Island National Airport (JTY)
On average, flying from Ulan-Ude to Astypalaia Island generates about 449 kg of CO2 per passenger; 449 kilograms equals about 990 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
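An estimate like this is typically the flight distance multiplied by a per-passenger emission factor. The ~70.7 g CO2 per passenger-kilometer factor below is back-calculated from the article's own figures (449 kg over 6347 km), not a published constant:

```python
# Per-passenger CO2 estimate from flight distance.
# factor_kg_per_km is an assumed emission factor, back-calculated
# from this article's figures; real factors vary by aircraft and load.
distance_km = 6347
factor_kg_per_km = 0.0707

co2_kg = distance_km * factor_kg_per_km
co2_lbs = co2_kg * 2.20462   # kilograms -> pounds
print(f"about {co2_kg:.0f} kg CO2 (~{co2_lbs:.0f} lbs) per passenger")
```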
Map of flight path from Ulan-Ude to Astypalaia Island
See the map of the shortest flight path between Baikal International Airport (UUD) and Astypalaia Island National Airport (JTY).
Airport information
| Origin | Baikal International Airport |
|---|---|
| City: | Ulan-Ude |
| Country: | Russia |
| IATA Code: | UUD |
| ICAO Code: | UIUU |
| Coordinates: | 51°48′28″N, 107°26′16″E |
| Destination | Astypalaia Island National Airport |
|---|---|
| City: | Astypalaia Island |
| Country: | Greece |
| IATA Code: | JTY |
| ICAO Code: | LGPL |
| Coordinates: | 36°34′47″N, 26°22′32″E |