How far is Naxos from Jebel Ali?
The distance between Jebel Ali (Al Maktoum International Airport) and Naxos (Naxos Island National Airport) is 1946 miles / 3132 kilometers / 1691 nautical miles.
Al Maktoum International Airport – Naxos Island National Airport
Distance from Jebel Ali to Naxos
There are several ways to calculate the distance from Jebel Ali to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 1946.187 miles
- 3132.084 kilometers
- 1691.190 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
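For readers who want to reproduce the ellipsoidal figure, here is a minimal sketch using the geopy library (not necessarily the tool behind the numbers above). geopy's `geodesic()` uses Karney's algorithm on the WGS-84 ellipsoid, which agrees closely with Vincenty's formula; the decimal coordinates are converted from the DMS values in the airport information tables below.

```python
from geopy.distance import geodesic

# Decimal-degree coordinates converted from the DMS values listed below
dwc = (24.896111, 55.161389)   # Al Maktoum International Airport (DWC)
jnx = (37.080833, 25.368056)   # Naxos Island National Airport (JNX)

d = geodesic(dwc, jnx)         # ellipsoidal (WGS-84) distance
print(f"{d.miles:.3f} miles")       # ~1946 miles
print(f"{d.kilometers:.3f} km")     # ~3132 km
print(f"{d.nautical:.3f} nm")       # ~1691 nautical miles
```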
Haversine formula
- 1944.075 miles
- 3128.685 kilometers
- 1689.355 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
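A minimal Python sketch of the haversine calculation. It assumes a 6371 km mean Earth radius; the radius used for the figures above isn't stated, so the last decimals may differ slightly.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# DWC to JNX in decimal degrees (converted from the DMS coordinates below)
print(f"{haversine_km(24.896111, 55.161389, 37.080833, 25.368056):.3f} km")
# ≈ 3128–3129 km, depending on the Earth radius used
```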
How long does it take to fly from Jebel Ali to Naxos?
The estimated flight time from Al Maktoum International Airport to Naxos Island National Airport is 4 hours and 11 minutes.
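Estimates like this are typically derived from the great-circle distance, an assumed average speed, and a fixed allowance for takeoff and landing. The exact parameters behind the figure above aren't published, so the sketch below uses illustrative values only.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: distance at an assumed average speed
    plus a fixed allowance for takeoff and landing (illustrative values)."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return hours, minutes

print(estimated_flight_time(1946.187))
# (4, 24) with these assumed parameters; the published 4 h 11 min figure
# evidently uses a different speed and/or overhead
```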
What is the time difference between Jebel Ali and Naxos?
The time difference between Jebel Ali and Naxos is 2 hours. Naxos is 2 hours behind Jebel Ali.
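The offset can be checked with Python's zoneinfo module. Note that the UAE stays on UTC+4 all year, while Greece observes daylight saving time, so the gap is 2 hours on the standard-time date used below and 1 hour during Greek summer time.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

when = datetime(2024, 1, 15, 12, 0)  # an arbitrary date outside Greek DST
dubai = when.replace(tzinfo=ZoneInfo("Asia/Dubai"))      # Jebel Ali: UTC+4 year-round
naxos = when.replace(tzinfo=ZoneInfo("Europe/Athens"))   # Naxos: UTC+2 (UTC+3 in summer)

offset_hours = (dubai.utcoffset() - naxos.utcoffset()).total_seconds() / 3600
print(offset_hours)  # 2.0 on this date; 1.0 while Greece is on daylight saving time
```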
Flight carbon footprint between Al Maktoum International Airport (DWC) and Naxos Island National Airport (JNX)
On average, flying from Jebel Ali to Naxos generates about 213 kg of CO2 per passenger; 213 kilograms is equal to 469 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
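The methodology behind this figure isn't stated. A common approach is to estimate per-passenger fuel burn for the route and multiply by roughly 3.16 kg of CO2 per kg of jet fuel burned; the sketch below uses an illustrative fuel-burn value chosen to land near the quoted number, not the calculator's actual model.

```python
KG_CO2_PER_KG_FUEL = 3.16   # CO2 from burning 1 kg of jet fuel (widely used factor)
LBS_PER_KG = 2.20462

def co2_per_passenger_kg(distance_km, fuel_kg_per_pax_km=0.0215):
    """Illustrative per-passenger CO2 estimate: assumed fuel burn per
    passenger-kilometre times the CO2 emitted per kg of fuel burned."""
    return distance_km * fuel_kg_per_pax_km * KG_CO2_PER_KG_FUEL

co2 = co2_per_passenger_kg(3132.084)
print(f"{co2:.0f} kg CO2 = {co2 * LBS_PER_KG:.0f} lbs")  # ~213 kg, ~469 lbs
```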
Map of flight path from Jebel Ali to Naxos
See the map of the shortest flight path between Al Maktoum International Airport (DWC) and Naxos Island National Airport (JNX).
Airport information
| Origin | Al Maktoum International Airport |
| --- | --- |
| City: | Jebel Ali |
| Country: | United Arab Emirates |
| IATA Code: | DWC |
| ICAO Code: | OMDW |
| Coordinates: | 24°53′46″N, 55°9′41″E |
| Destination | Naxos Island National Airport |
| --- | --- |
| City: | Naxos |
| Country: | Greece |
| IATA Code: | JNX |
| ICAO Code: | LGNX |
| Coordinates: | 37°4′51″N, 25°22′5″E |
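The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on this page need decimal degrees. A minimal conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# DWC: 24°53′46″N, 55°9′41″E  ->  (24.896111, 55.161389)
print(dms_to_decimal(24, 53, 46, "N"), dms_to_decimal(55, 9, 41, "E"))
# JNX: 37°4′51″N, 25°22′5″E   ->  (37.080833, 25.368056)
print(dms_to_decimal(37, 4, 51, "N"), dms_to_decimal(25, 22, 5, "E"))
```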