How far is Naxos from Dallas, TX?
The distance between Dallas (Dallas/Fort Worth International Airport) and Naxos (Naxos Island National Airport) is 6357 miles / 10230 kilometers / 5524 nautical miles.
Dallas/Fort Worth International Airport – Naxos Island National Airport
Distance from Dallas to Naxos
There are several ways to calculate the distance from Dallas to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 6356.509 miles
- 10229.809 kilometers
- 5523.655 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
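As a concrete sketch of the ellipsoidal approach, the following implements the standard Vincenty inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values in the airport table below; the convergence loop is a minimal version that does not handle the near-antipodal case, where Vincenty's iteration can fail.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid.

    Minimal sketch: assumes the points are not near-antipodal
    (where this iteration may fail to converge). Returns (miles, km).
    """
    a = 6378137.0              # WGS-84 semi-major axis, meters
    f = 1 / 298.257223563      # WGS-84 flattening
    b = a * (1 - f)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344, meters / 1000

# DFW (32°53′48″N, 97°2′16″W) and JNX (37°4′51″N, 25°22′5″E) in decimal degrees
miles, km = vincenty_distance(32.896667, -97.037778, 37.080833, 25.368056)
```

Run against the DFW and JNX coordinates, this reproduces the ~6,356.5-mile figure quoted above.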
Haversine formula
- 6343.178 miles
- 10208.356 kilometers
- 5512.071 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
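The haversine calculation is compact enough to show in full. The sketch below uses a 6,371 km mean Earth radius (a common convention, assumed here rather than stated by the source) and the airports' decimal coordinates:

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# DFW -> JNX, coordinates converted from the DMS values in the table below
km = haversine(32.896667, -97.037778, 37.080833, 25.368056)
miles = km / 1.609344
```

The spherical result comes out about 13 miles shorter than the ellipsoidal Vincenty figure for this route.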
How long does it take to fly from Dallas to Naxos?
The estimated flight time from Dallas/Fort Worth International Airport to Naxos Island National Airport is 12 hours and 32 minutes.
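The source does not state how the estimate is derived, but the quoted 12 h 32 min is consistent with dividing the Vincenty distance by an average speed of roughly 507 mph. That speed is an assumption chosen here to match the figure, not a value from the source:

```python
# Assumed average block speed (mph); picked so the estimate reproduces the
# quoted 12 h 32 min -- not a figure stated by the source.
AVG_SPEED_MPH = 507

distance_miles = 6356.509            # Vincenty distance from the section above
hours_float = distance_miles / AVG_SPEED_MPH
hours = int(hours_float)
minutes = round((hours_float - hours) * 60)
print(f"Estimated flight time: {hours} hours and {minutes} minutes")
```

Actual block times vary with winds, routing, and taxi time, so published estimates differ between calculators.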
What is the time difference between Dallas and Naxos?
The time difference between Dallas and Naxos is 8 hours: Naxos is 8 hours ahead of Dallas.
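The offset can be checked with the standard-library `zoneinfo` package, using the IANA zones for the two cities:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def hours_ahead(when):
    """Hours Naxos (Europe/Athens) is ahead of Dallas (America/Chicago)."""
    naxos = when.replace(tzinfo=ZoneInfo("Europe/Athens")).utcoffset()
    dallas = when.replace(tzinfo=ZoneInfo("America/Chicago")).utcoffset()
    return (naxos - dallas).total_seconds() / 3600

winter = hours_ahead(datetime(2024, 1, 15, 12, 0))   # CST (UTC-6) vs EET (UTC+2)
summer = hours_ahead(datetime(2024, 7, 15, 12, 0))   # CDT (UTC-5) vs EEST (UTC+3)
# Both yield 8 hours. Note: for the few weeks each year when the US and EU
# daylight-saving transitions are out of step, the gap briefly becomes 7 hours.
```

Both locations observe daylight saving time, so outside those transition windows the 8-hour difference holds year-round.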
Flight carbon footprint between Dallas/Fort Worth International Airport (DFW) and Naxos Island National Airport (JNX)
On average, flying from Dallas to Naxos generates about 765 kg of CO2 per passenger, which is about 1,687 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
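The unit conversion, and the per-mile emission factor implied by the quoted numbers, can be reproduced as follows. The per-mile factor is an inference from this page's own figures, not a value the source states:

```python
KG_PER_LB = 0.45359237          # exact definition of the avoirdupois pound

co2_kg = 765                    # per-passenger estimate quoted above
co2_lb = co2_kg / KG_PER_LB     # ~1,687 lbs

# Implied emission factor for this route (inferred, not a source figure):
# roughly 0.12 kg CO2 per passenger-mile over the 6,357-mile distance.
kg_per_mile = co2_kg / 6357
```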
Map of flight path from Dallas to Naxos
See the map of the shortest flight path between Dallas/Fort Worth International Airport (DFW) and Naxos Island National Airport (JNX).
Airport information
| Origin | Dallas/Fort Worth International Airport |
| --- | --- |
| City | Dallas, TX |
| Country | United States |
| IATA Code | DFW |
| ICAO Code | KDFW |
| Coordinates | 32°53′48″N, 97°2′16″W |
| Destination | Naxos Island National Airport |
| --- | --- |
| City | Naxos |
| Country | Greece |
| IATA Code | JNX |
| ICAO Code | LGNX |
| Coordinates | 37°4′51″N, 25°22′5″E |