How far is Naxos from Punta Cana?
The distance between Punta Cana (Punta Cana International Airport) and Naxos (Naxos Island National Airport) is 5660 miles / 9109 kilometers / 4918 nautical miles.
Punta Cana International Airport – Naxos Island National Airport
Distance from Punta Cana to Naxos
There are several ways to calculate the distance from Punta Cana to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 5660.019 miles
- 9108.918 kilometers
- 4918.422 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
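A minimal sketch of the standard Vincenty inverse iteration on the WGS-84 ellipsoid (the function name and convergence tolerance are illustrative; the airport coordinates are the decimal-degree equivalents of the DMS values in the tables below):

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                   if cos_sq_alpha else 0.0)   # equatorial geodesic
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                          * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# PUJ -> JNX; should land very close to the 9108.918 km figure above
print(vincenty_distance_km(18.5672, -68.3633, 37.0808, 25.3681))
```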
Haversine formula
- 5651.271 miles
- 9094.839 kilometers
- 4910.820 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, yielding the great-circle distance (the shortest path between two points along the surface of the sphere).
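For comparison, the haversine computation fits in a few lines. A sketch assuming the conventional mean Earth radius of 6,371 km (the site's exact radius isn't stated, so the last decimal places may differ):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# PUJ -> JNX; approximately the 9094.839 km great-circle figure above
print(haversine_km(18.5672, -68.3633, 37.0808, 25.3681))
```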
How long does it take to fly from Punta Cana to Naxos?
The estimated flight time from Punta Cana International Airport to Naxos Island National Airport is 11 hours and 12 minutes.
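The page doesn't state the speed model behind this estimate; a common approach is distance divided by an assumed average block speed. With a hypothetical average of 505 mph, the arithmetic reproduces the figure above:

```python
def estimated_flight_time(distance_miles: float, avg_speed_mph: float = 505.0) -> str:
    """Rough block-time estimate: distance over an assumed average speed."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours and {m} minutes"

print(estimated_flight_time(5660))  # -> "11 hours and 12 minutes" with the assumed speed
```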
What is the time difference between Punta Cana and Naxos?
The time difference between Punta Cana and Naxos is 6 hours: Naxos is 6 hours ahead of Punta Cana. The gap widens to 7 hours while Greece observes daylight saving time, since the Dominican Republic stays on Atlantic Standard Time year-round.
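This is easy to check programmatically. A sketch using Python's zoneinfo with the IANA zones covering the two airports (America/Santo_Domingo for Punta Cana, Europe/Athens for Naxos):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def hours_ahead(tz_a: str, tz_b: str, when: datetime) -> float:
    """How many hours tz_b is ahead of tz_a at the given instant."""
    offset_a = when.astimezone(ZoneInfo(tz_a)).utcoffset()
    offset_b = when.astimezone(ZoneInfo(tz_b)).utcoffset()
    return (offset_b - offset_a).total_seconds() / 3600

now = datetime.now(timezone.utc)
print(hours_ahead("America/Santo_Domingo", "Europe/Athens", now))
# 6.0 in the Greek winter, 7.0 while EU summer time is in effect
```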
Flight carbon footprint between Punta Cana International Airport (PUJ) and Naxos Island National Airport (JNX)
On average, flying from Punta Cana to Naxos generates about 671 kg of CO2 per passenger; 671 kilograms equals 1,479 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
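The unit conversion and the implied per-mile factor are simple to verify. Note that the 0.119 kg/mile figure below is just the stated total divided by the distance, not the site's published emissions model:

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound

co2_kg = 671
distance_miles = 5660
print(f"{co2_kg / KG_PER_LB:,.0f} lb")             # -> 1,479 lb
print(f"{co2_kg / distance_miles:.3f} kg CO2/mi")  # -> 0.119 kg per passenger-mile
```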
Map of flight path from Punta Cana to Naxos
See the map of the shortest flight path between Punta Cana International Airport (PUJ) and Naxos Island National Airport (JNX).
Airport information
Origin | Punta Cana International Airport |
---|---|
City: | Punta Cana |
Country: | Dominican Republic |
IATA Code: | PUJ |
ICAO Code: | MDPC |
Coordinates: | 18°34′2″N, 68°21′48″W |
Destination | Naxos Island National Airport |
---|---|
City: | Naxos |
Country: | Greece |
IATA Code: | JNX |
ICAO Code: | LGNX |
Coordinates: | 37°4′51″N, 25°22′5″E |
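The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A minimal conversion sketch (the regex assumes the exact ° ′ ″ notation used in these tables; the function name is illustrative):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert e.g. '18°34′2″N' to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(dms_to_decimal("18°34′2″N"), dms_to_decimal("68°21′48″W"))
# -> 18.5672..., -68.3633...
```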