How far is Naxos from Baghdad?
The distance between Baghdad (Baghdad International Airport) and Naxos (Naxos Island National Airport) is 1098 miles / 1767 kilometers / 954 nautical miles.
Baghdad International Airport – Naxos Island National Airport
Distance from Baghdad to Naxos
There are several ways to calculate the distance from Baghdad to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 1097.860 miles
- 1766.835 kilometers
- 954.014 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
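If you want to reproduce the ellipsoidal figure yourself, the sketch below is a straightforward Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport coordinates listed in the tables at the end of the page; the result should land close to the 1,097.86-mile figure above, with any small difference coming down to rounding and the ellipsoid constants used.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance between two lat/lon points (decimal degrees) on the WGS-84
    ellipsoid, computed with Vincenty's inverse formula. Returns statute miles."""
    a = 6378137.0             # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis (metres)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate on the longitude difference on the auxiliary sphere
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344   # metres -> statute miles

# BGW and JNX coordinates converted to decimal degrees (see the airport tables below)
print(f"{vincenty_miles(33.2625, 44.2344, 37.0808, 25.3681):.1f} miles")
```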
Haversine formula
- 1095.699 miles
- 1763.357 kilometers
- 952.137 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
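The great-circle figure is easy to check by hand. The snippet below is a minimal haversine implementation using a mean Earth radius of 3,958.8 miles; the radius you choose affects the result slightly, so it may not match the table above to the last decimal.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two lat/lon points (decimal degrees)
    on a sphere with the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_miles * math.asin(math.sqrt(h))

print(f"{haversine_miles(33.2625, 44.2344, 37.0808, 25.3681):.1f} miles")
```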
How long does it take to fly from Baghdad to Naxos?
The estimated flight time from Baghdad International Airport to Naxos Island National Airport is 2 hours and 34 minutes.
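The model behind this estimate is not stated on the page. A common rule of thumb is cruise distance divided by an assumed average speed, plus a fixed allowance for taxi, climb, and descent; the constants below (500 mph and 30 minutes) are illustrative assumptions, so the result will not match the 2 h 34 min figure exactly.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.

    cruise_mph and overhead_min are illustrative assumptions, not the site's model.
    """
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1098))  # about 2 h 42 min with these assumptions
```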
What is the time difference between Baghdad and Naxos?
The time difference between Baghdad and Naxos is 1 hour. Naxos is 1 hour behind Baghdad.
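The offset can be checked against the IANA time-zone database. Iraq does not observe daylight saving time while Greece does, so the 1-hour difference applies outside Greek summer time; during summer time the two places share the same clock. A minimal check with Python's zoneinfo:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Asia/Baghdad is UTC+3 year-round; Naxos follows Europe/Athens
# (UTC+2 in winter, UTC+3 during summer time), so the offset is
# 1 hour in winter and 0 hours in summer.
when = datetime(2024, 1, 15, 12, 0)  # an arbitrary winter date
baghdad = when.replace(tzinfo=ZoneInfo("Asia/Baghdad"))
naxos_offset = baghdad.astimezone(ZoneInfo("Europe/Athens")).utcoffset()
diff_hours = (baghdad.utcoffset() - naxos_offset).total_seconds() / 3600
print(f"Baghdad is {diff_hours:+.0f} h ahead of Naxos on {when.date()}")
```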
Flight carbon footprint between Baghdad International Airport (BGW) and Naxos Island National Airport (JNX)
On average, flying from Baghdad to Naxos generates about 157 kg of CO2 per passenger (roughly 345 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
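The emissions model behind the 157 kg figure is not published here. For orientation only, the sketch below applies the commonly cited factor of about 3.16 kg of CO2 per kg of jet fuel burned; both that factor and the implied fuel burn are assumptions for illustration, not the site's calculation.

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound
FUEL_TO_CO2 = 3.16       # commonly cited kg of CO2 per kg of jet fuel (assumption, not the site's stated factor)

co2_kg = 157                       # per-passenger estimate quoted above
co2_lb = co2_kg / KG_PER_LB        # ~346 lb from the rounded 157 kg; the page's 345 lb likely uses the unrounded value
fuel_kg = co2_kg / FUEL_TO_CO2     # implied jet-fuel burn per passenger under the 3.16 factor
print(f"{co2_kg} kg CO2 ≈ {co2_lb:.0f} lb, implying roughly {fuel_kg:.0f} kg of fuel per passenger")
```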
Map of flight path from Baghdad to Naxos
See the map of the shortest flight path between Baghdad International Airport (BGW) and Naxos Island National Airport (JNX).
Airport information
| Origin | Baghdad International Airport |
| --- | --- |
| City: | Baghdad |
| Country: | Iraq |
| IATA Code: | BGW |
| ICAO Code: | ORBI |
| Coordinates: | 33°15′45″N, 44°14′4″E |
| Destination | Naxos Island National Airport |
| --- | --- |
| City: | Naxos |
| Country: | Greece |
| IATA Code: | JNX |
| ICAO Code: | LGNX |
| Coordinates: | 37°4′51″N, 25°22′5″E |
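The coordinates in these tables are given in degrees, minutes, and seconds, while the distance formulas earlier on the page use decimal degrees. A small helper (hypothetical, just for illustration) shows the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Baghdad International Airport (BGW): 33°15′45″N, 44°14′4″E
bgw = (dms_to_decimal(33, 15, 45, "N"), dms_to_decimal(44, 14, 4, "E"))   # (33.2625, 44.2344...)
# Naxos Island National Airport (JNX): 37°4′51″N, 25°22′5″E
jnx = (dms_to_decimal(37, 4, 51, "N"), dms_to_decimal(25, 22, 5, "E"))    # (37.0808..., 25.3681...)
print(bgw, jnx)
```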