
How far is Naxos from Isfahan?

The distance between Isfahan (Isfahan International Airport) and Naxos (Naxos Island National Airport) is 1528 miles / 2460 kilometers / 1328 nautical miles.

Isfahan International Airport – Naxos Island National Airport

Distance: 1528 miles / 2460 kilometers / 1328 nautical miles
Flight time: 3 h 23 min
Time difference: 1 h 30 min
CO2 emission: 181 kg

Distance from Isfahan to Naxos

There are several ways to calculate the distance from Isfahan to Naxos. Here are two standard methods:

Vincenty's formula (applied above)
  • 1528.426 miles
  • 2459.764 kilometers
  • 1328.166 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
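
For readers who want to reproduce the figures, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed at the bottom of the page. It is an illustrative implementation, not the calculator's own code, and it should agree with the Vincenty figures above to within rounding.

import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in metres between two points on the WGS-84 ellipsoid,
    computed with Vincenty's inverse formula."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                        if cos_sq_alpha else 0.0)
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)       # geodesic length in metres

# IFN 32°45′2″N 51°51′40″E and JNX 37°4′51″N 25°22′5″E in decimal degrees
m = vincenty_inverse(32.7506, 51.8611, 37.0808, 25.3681)
print(f"{m / 1609.344:.1f} mi / {m / 1000:.1f} km / {m / 1852:.1f} NM")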

Haversine formula
  • 1525.308 miles
  • 2454.746 kilometers
  • 1325.457 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (the great-circle distance, i.e. the shortest path between the two points along the surface of the sphere).
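
The same route can be checked with a few lines of Python. The sketch below uses the commonly quoted mean earth radius of 6371 km (the radius the calculator assumes is not stated); with that value it reproduces the haversine figures above to within rounding.

import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# IFN -> JNX, airport coordinates from the table at the bottom of the page
km = haversine(32.7506, 51.8611, 37.0808, 25.3681)
print(f"{km / 1.609344:.1f} mi / {km:.1f} km / {km / 1.852:.1f} NM")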

How long does it take to fly from Isfahan to Naxos?

The estimated flight time from Isfahan International Airport to Naxos Island National Airport is 3 hours and 23 minutes.
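
The page does not say how this figure is derived. As an illustration only, a simple model with an assumed 500 mph average speed and an assumed 20-minute allowance for takeoff and landing happens to reproduce 3 hours 23 minutes; the calculator's actual constants may differ.

# Illustrative estimate only: the cruise speed and overhead below are
# assumptions, not the calculator's documented constants.
CRUISE_MPH = 500          # assumed average block speed
OVERHEAD_MIN = 20         # assumed allowance for takeoff and landing

distance_miles = 1528.426                      # Vincenty distance from above
total_min = distance_miles / CRUISE_MPH * 60 + OVERHEAD_MIN
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # 3 h 23 min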

Flight carbon footprint between Isfahan International Airport (IFN) and Naxos Island National Airport (JNX)

On average, flying from Isfahan to Naxos generates about 181 kg of CO2 per passenger, which is roughly 399 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
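
For context, the quoted figure works out to roughly 0.12 kg of CO2 per passenger-mile on this route. The snippet below simply restates that arithmetic and the kilogram-to-pound conversion; it is not the calculator's estimation method, which is not documented on this page.

co2_kg = 181.0                   # per-passenger estimate quoted above
distance_miles = 1528.426        # Vincenty distance in miles

print(f"{co2_kg / distance_miles:.3f} kg CO2 per passenger-mile")  # about 0.118
print(f"{co2_kg * 2.20462:.0f} lb")                                # about 399 lb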

Map of flight path from Isfahan to Naxos

See the map of the shortest flight path between Isfahan International Airport (IFN) and Naxos Island National Airport (JNX).

Airport information

Origin: Isfahan International Airport
City: Isfahan
Country: Iran
IATA Code: IFN
ICAO Code: OIFM
Coordinates: 32°45′2″N, 51°51′40″E
Destination: Naxos Island National Airport
City: Naxos
Country: Greece
IATA Code: JNX
ICAO Code: LGNX
Coordinates: 37°4′51″N, 25°22′5″E