
How far is Naxos from Hurghada?

The distance between Hurghada (Hurghada International Airport) and Naxos (Naxos Island National Airport) is 842 miles / 1355 kilometers / 731 nautical miles.

Hurghada International Airport – Naxos Island National Airport


Distance from Hurghada to Naxos

There are several ways to calculate the distance from Hurghada to Naxos. Here are two standard methods:

Vincenty's formula (applied above)
  • 841.734 miles
  • 1354.639 kilometers
  • 731.447 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
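As a rough check of the figures above, Vincenty's inverse formula can be sketched in a few dozen lines of Python. This is a minimal implementation of the standard published iteration, assuming the WGS-84 ellipsoid (the page does not state which datum it uses), and it omits the special handling needed for coincident or nearly antipodal points:

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

# WGS-84 ellipsoid parameters (an assumption; the site does not name its datum)
A = 6378137.0              # semi-major axis, meters
F = 1 / 298.257223563      # flattening
B = (1 - F) * A            # semi-minor axis, meters

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: geodesic distance in kilometers on the ellipsoid."""
    U1 = atan((1 - F) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - F) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:   # converged
            break
    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = big_b * sin_sigma * (
        cos_2sigma_m + big_b / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            big_b / 6 * cos_2sigma_m *
            (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return B * big_a * (sigma - delta_sigma) / 1000.0

# HRG -> JNX, coordinates taken from the airport information below
km = vincenty_km(27 + 10/60 + 41/3600, 33 + 47/60 + 57/3600,
                 37 + 4/60 + 51/3600, 25 + 22/60 + 5/3600)
print(f"{km:.1f} km")
```

Run on the HRG and JNX coordinates, this reproduces the roughly 1354.6 kilometers listed above.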

Haversine formula
  • 842.661 miles
  • 1356.132 kilometers
  • 732.252 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
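The haversine calculation is much simpler and can be verified directly. The sketch below assumes a mean Earth radius of 6371 km (a common choice; the exact radius the site uses is not stated) and converts the airport coordinates from the degrees/minutes/seconds given below into decimal degrees:

```python
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_KM = 6371.0  # assumed mean Earth radius for the spherical model

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# HRG 27°10′41″N, 33°47′57″E and JNX 37°4′51″N, 25°22′5″E, as decimal degrees
hrg = (27 + 10/60 + 41/3600, 33 + 47/60 + 57/3600)
jnx = (37 + 4/60 + 51/3600, 25 + 22/60 + 5/3600)

km = haversine_km(hrg[0], hrg[1], jnx[0], jnx[1])
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} nmi")
```

This lands within a kilometer or so of the 1356.132 km figure above; the small residual comes from the choice of Earth radius.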

How long does it take to fly from Hurghada to Naxos?

The estimated flight time from Hurghada International Airport to Naxos Island National Airport is 2 hours and 5 minutes.
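A common rule of thumb for such estimates (an assumption here, not necessarily this site's formula) is a fixed allowance of about 30 minutes for taxi, climb, and descent plus cruise at roughly 500 mph:

```python
# Hypothetical rule-of-thumb parameters, not taken from the site
distance_miles = 842   # great-circle distance from above
cruise_mph = 500       # assumed average cruise speed
overhead_min = 30      # assumed fixed takeoff/landing allowance

total_min = overhead_min + distance_miles / cruise_mph * 60
hours, minutes = divmod(round(total_min), 60)
print(f"about {hours} h {minutes} min")
```

With these assumed parameters the estimate comes out within a few minutes of the 2 hours 5 minutes quoted above.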

What is the time difference between Hurghada and Naxos?

There is no time difference between Hurghada and Naxos.

Flight carbon footprint between Hurghada International Airport (HRG) and Naxos Island National Airport (JNX)

On average, flying from Hurghada to Naxos generates about 139 kg (306 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
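The kilogram-to-pound conversion behind that figure is straightforward, using the exact definition of the pound:

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound

co2_kg = 139             # per-passenger estimate from above
co2_lb = co2_kg / KG_PER_LB
print(f"{co2_kg} kg is about {co2_lb:.0f} lb")  # about 306 lb
```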

Map of flight path from Hurghada to Naxos

See the map of the shortest flight path between Hurghada International Airport (HRG) and Naxos Island National Airport (JNX).

Airport information

Origin Hurghada International Airport
City: Hurghada
Country: Egypt
IATA Code: HRG
ICAO Code: HEGN
Coordinates: 27°10′41″N, 33°47′57″E
Destination Naxos Island National Airport
City: Naxos
Country: Greece
IATA Code: JNX
ICAO Code: LGNX
Coordinates: 37°4′51″N, 25°22′5″E