How far is Astypalaia Island from Bukhara?
The distance between Bukhara (Bukhara International Airport) and Astypalaia Island (Astypalaia Island National Airport) is 2071 miles / 3333 kilometers / 1799 nautical miles.
Bukhara International Airport – Astypalaia Island National Airport
Distance from Bukhara to Astypalaia Island
There are several ways to calculate the distance from Bukhara to Astypalaia Island. Here are two standard methods:
Vincenty's formula (applied above)
- 2070.752 miles
- 3332.552 kilometers
- 1799.434 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
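For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. This is the standard textbook formulation, not this site's own code; the iteration cap and convergence tolerance are arbitrary choices.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha == 0 only for points on the equator
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# BHK -> JTY, using the coordinates from the airport tables below;
# should reproduce the ~2070.75-mile figure quoted above
print(vincenty_distance(39.7750, 64.4831, 36.5797, 26.3756) / 1609.344)
```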
Haversine formula
- 2065.891 miles
- 3324.729 kilometers
- 1795.210 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
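A haversine sketch in Python, assuming the commonly used mean Earth radius of 6371 km (the radius this site uses is not stated):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# BHK -> JTY, coordinates from the airport tables below
print(haversine_distance(39.7750, 64.4831, 36.5797, 26.3756))  # ≈ 3324.7 km
```

The small spread between this result and Vincenty's (about 8 km here) reflects the spherical versus ellipsoidal Earth models, not a bug in either formula.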
How long does it take to fly from Bukhara to Astypalaia Island?
The estimated flight time from Bukhara International Airport to Astypalaia Island National Airport is 4 hours and 25 minutes.
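Estimates like this are typically derived from the distance, an assumed average speed, and a fixed allowance for taxi, climb, and descent. The exact parameters behind the 4 hours 25 minutes figure are not published, so the values in this sketch are placeholders:

```python
def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # Block-time estimate: cruise segment plus a fixed allowance for
    # taxi, climb and descent. Both parameters are assumptions, not
    # the values this site actually uses.
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    return divmod(round(total_min), 60)

hours, minutes = flight_time(2071)
print(f"{hours} h {minutes} min")  # ≈ 4 h 39 min under these assumptions
```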
What is the time difference between Bukhara and Astypalaia Island?
Bukhara is 3 hours ahead of Astypalaia Island (Uzbekistan is UTC+5 year-round, Greece is UTC+2); the gap narrows to 2 hours while Greece observes daylight saving time (UTC+3).
Flight carbon footprint between Bukhara International Airport (BHK) and Astypalaia Island National Airport (JTY)
On average, flying from Bukhara to Astypalaia Island generates about 225 kg of CO2 per passenger; 225 kilograms equals about 496 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
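As a sanity check on the unit conversion, and to derive the emission factor this figure implies (the derivation below is ours, not the site's published methodology):

```python
KG_PER_LB = 0.45359237                # exact definition of the pound

co2_kg = 225                          # per-passenger estimate quoted above
co2_lb = co2_kg / KG_PER_LB           # ≈ 496 lbs
per_km_g = co2_kg / 3333 * 1000      # implied factor: ≈ 67.5 g CO2 per passenger-km
print(round(co2_lb), round(per_km_g, 1))
```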
Map of flight path from Bukhara to Astypalaia Island
See the map of the shortest flight path between Bukhara International Airport (BHK) and Astypalaia Island National Airport (JTY).
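The plotted path is the great circle between the two airports. One standard way to generate waypoints for such a map is spherical interpolation between the endpoints; the sketch below uses the textbook formula and is not this site's rendering code.

```python
import math

def great_circle_points(lat1, lon1, lat2, lon2, n=50):
    """n+1 (lat, lon) waypoints along the great circle between two points."""
    phi1, lam1 = math.radians(lat1), math.radians(lon1)
    phi2, lam2 = math.radians(lat2), math.radians(lon2)
    # Angular distance between the endpoints
    d = math.acos(math.sin(phi1) * math.sin(phi2)
                  + math.cos(phi1) * math.cos(phi2) * math.cos(lam2 - lam1))
    points = []
    for i in range(n + 1):
        f = i / n
        A = math.sin((1 - f) * d) / math.sin(d)
        B = math.sin(f * d) / math.sin(d)
        # Interpolate on the unit sphere in Cartesian coordinates
        x = A * math.cos(phi1) * math.cos(lam1) + B * math.cos(phi2) * math.cos(lam2)
        y = A * math.cos(phi1) * math.sin(lam1) + B * math.cos(phi2) * math.sin(lam2)
        z = A * math.sin(phi1) + B * math.sin(phi2)
        points.append((math.degrees(math.atan2(z, math.hypot(x, y))),
                       math.degrees(math.atan2(y, x))))
    return points

waypoints = great_circle_points(39.7750, 64.4831, 36.5797, 26.3756)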
Airport information
| Origin | Bukhara International Airport |
|---|---|
| City: | Bukhara |
| Country: | Uzbekistan |
| IATA Code: | BHK |
| ICAO Code: | UTSB |
| Coordinates: | 39°46′30″N, 64°28′59″E |
| Destination | Astypalaia Island National Airport |
|---|---|
| City: | Astypalaia Island |
| Country: | Greece |
| IATA Code: | JTY |
| ICAO Code: | LGPL |
| Coordinates: | 36°34′47″N, 26°22′32″E |
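The coordinates above are given in degrees, minutes, and seconds. A small helper (hypothetical, written to match the formatting used in these tables) converts them to the decimal degrees that the distance formulas earlier expect:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 39°46′30″N to decimal degrees."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    # South and West hemispheres are negative by convention
    return -value if hemi in "SW" else value

print(dms_to_decimal("39°46′30″N"))  # 39.775
print(dms_to_decimal("64°28′59″E"))  # ≈ 64.4831
```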