
How far is Naxos from Ahmedabad?

The distance between Ahmedabad (Sardar Vallabhbhai Patel International Airport) and Naxos (Naxos Island National Airport) is 2958 miles / 4760 kilometers / 2570 nautical miles.

Sardar Vallabhbhai Patel International Airport – Naxos Island National Airport

Distance: 2958 miles / 4760 kilometers / 2570 nautical miles
Flight time: 6 h 5 min
Time difference: 3 h 30 min
CO2 emission: 329 kg


Distance from Ahmedabad to Naxos

There are several ways to calculate the distance from Ahmedabad to Naxos. Here are two standard methods:

Vincenty's formula (applied above)
  • 2957.604 miles
  • 4759.802 kilometers
  • 2570.087 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
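As an illustration, the iterative Vincenty inverse method on the WGS-84 ellipsoid can be sketched in Python. This is the standard textbook formulation, not necessarily this site's exact implementation; the convergence threshold and iteration cap are arbitrary choices.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse distance (km) on the WGS-84 ellipsoid."""
    a, f = 6378137.0, 1 / 298.257223563   # semi-major axis (m), flattening
    b = (1 - f) * a                       # semi-minor axis (m)
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):                  # iterate until lambda converges
        sinLam, cosLam = sin(lam), cos(lam)
        sinSigma = sqrt((cosU2 * sinLam) ** 2
                        + (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM
                + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                             * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma) / 1000.0

# AMD and JNX coordinates from the airport table, in decimal degrees
km = vincenty_km(23 + 4/60 + 37/3600, 72 + 38/60 + 4/3600,
                 37 + 4/60 + 51/3600, 25 + 22/60 + 5/3600)
```

For these two airports the sketch lands close to the 4759.802 km quoted above; the simplified iteration can fail to converge for nearly antipodal points, which a production implementation must handle.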

Haversine formula
  • 2953.400 miles
  • 4753.037 kilometers
  • 2566.435 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
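The haversine computation is compact enough to reproduce directly. A minimal Python sketch, using a mean Earth radius of 6371 km (the radius choice is an assumption; different sites use slightly different values):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

# AMD: 23°4′37″N, 72°38′4″E and JNX: 37°4′51″N, 25°22′5″E in decimal degrees
km = haversine_km(23 + 4/60 + 37/3600, 72 + 38/60 + 4/3600,
                  37 + 4/60 + 51/3600, 25 + 22/60 + 5/3600)
```

This reproduces the ~4753 km haversine figure listed above, a few kilometers short of the ellipsoidal Vincenty result.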

How long does it take to fly from Ahmedabad to Naxos?

The estimated flight time from Sardar Vallabhbhai Patel International Airport to Naxos Island National Airport is 6 hours and 5 minutes.
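The page does not state how the estimate is derived, but the quoted distance and duration imply an average block speed you can back out yourself (the ~486 mph value below is simply the quotient of the two published figures, not a parameter the site documents):

```python
distance_miles = 2958
block_time_h = 6 + 5 / 60                      # 6 h 5 min in hours
avg_speed_mph = distance_miles / block_time_h  # implied average speed
```

The result is roughly 486 mph, a plausible door-to-door average once climb, descent, and taxi time are folded into a typical jet cruise speed.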

Flight carbon footprint between Sardar Vallabhbhai Patel International Airport (AMD) and Naxos Island National Airport (JNX)

On average, flying from Ahmedabad to Naxos generates about 329 kg of CO2 per passenger, which is roughly 725 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
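The kilogram-to-pound conversion is straightforward (1 kg ≈ 2.20462 lb):

```python
co2_kg = 329
co2_lbs = co2_kg * 2.20462   # ≈ 725 lb
```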

Map of flight path from Ahmedabad to Naxos

See the map of the shortest flight path between Sardar Vallabhbhai Patel International Airport (AMD) and Naxos Island National Airport (JNX).

Airport information

Origin Sardar Vallabhbhai Patel International Airport
City: Ahmedabad
Country: India
IATA Code: AMD
ICAO Code: VAAH
Coordinates: 23°4′37″N, 72°38′4″E
Destination Naxos Island National Airport
City: Naxos
Country: Greece
IATA Code: JNX
ICAO Code: LGNX
Coordinates: 37°4′51″N, 25°22′5″E