
How far is Pisa from Naxos?

The distance between Naxos (Naxos Island National Airport) and Pisa (Pisa International Airport) is 910 miles / 1465 kilometers / 791 nautical miles.

The driving distance from Naxos (JNX) to Pisa (PSA) is 1553 miles / 2499 kilometers, and travel time by car is about 41 hours 55 minutes.

Naxos Island National Airport – Pisa International Airport

Distance: 910 miles / 1465 kilometers / 791 nautical miles

Distance from Naxos to Pisa

There are several ways to calculate the distance from Naxos to Pisa. Here are two standard methods:

Vincenty's formula (applied above)
  • 910.066 miles
  • 1464.609 kilometers
  • 790.826 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
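
For reference, an ellipsoidal distance of this kind can be reproduced in a few lines of Python. The sketch below uses geopy's geodesic helper, which works on the WGS-84 ellipsoid (via Karney's algorithm, a refinement of Vincenty's method), so the result should be essentially the same as the figure above; the decimal coordinates are converted from the DMS values listed in the airport information section.

```python
# Sketch: ellipsoidal (WGS-84) distance from JNX to PSA using geopy.
# geopy's geodesic() uses Karney's algorithm, a refinement of Vincenty's
# method, so the result is essentially the same as the figure quoted above.
from geopy.distance import geodesic

jnx = (37.080833, 25.368056)   # Naxos Island National Airport (decimal degrees)
psa = (43.683889, 10.392500)   # Pisa International Airport (decimal degrees)

d = geodesic(jnx, psa)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
# Expected to be very close to 910.066 mi / 1464.609 km / 790.826 NM
```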

Haversine formula
  • 908.663 miles
  • 1462.351 kilometers
  • 789.607 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
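
A minimal, self-contained haversine implementation, assuming a spherical earth with mean radius 6371.009 km and the same decimal coordinates as above:

```python
# Sketch: great-circle (haversine) distance, assuming a spherical earth
# with mean radius 6371.009 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.009):
    """Great-circle distance in kilometers between two lat/lon points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(37.080833, 25.368056, 43.683889, 10.392500)  # JNX -> PSA
print(f"{km:.3f} km / {km / 1.609344:.3f} mi / {km / 1.852:.3f} NM")
# Roughly 1462 km / 909 mi / 790 NM, in line with the haversine figures above
```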

How long does it take to fly from Naxos to Pisa?

The estimated flight time from Naxos Island National Airport to Pisa International Airport is 2 hours and 13 minutes.
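
Estimates like this are typically derived from the great-circle distance, an average cruise speed, and a fixed allowance for taxi, climb and descent. The sketch below assumes a 500 mph block speed and a 30-minute allowance; it is a common rule of thumb, not necessarily the exact method used by the calculator above.

```python
# Sketch: rough flight-time estimate from distance, assuming an average
# ~500 mph block speed plus a fixed 30-minute allowance for taxi, climb
# and descent. A common rule of thumb, not the calculator's stated method.
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes:02d} min"

print(estimate_flight_time(910))  # ~2 h 19 min, close to the 2 h 13 min quoted above
```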

Flight carbon footprint between Naxos Island National Airport (JNX) and Pisa International Airport (PSA)

On average, flying from Naxos to Pisa generates about 144 kg of CO2 per passenger, which is equivalent to roughly 318 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
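
As a quick sanity check, the unit conversion and the implied per-kilometre rate can be worked out directly from the figures quoted above; the per-kilometre value is only back-calculated from those numbers, not a published emission factor.

```python
# Sanity check of the CO2 figures above. The per-kilometre rate is simply
# back-calculated from the quoted 144 kg over 1465 km; it is not a published
# emission factor.
KG_PER_LB = 0.45359237

co2_kg = 144
distance_km = 1465

print(f"{co2_kg / KG_PER_LB:.1f} lb")                               # ~317.5 lb, rounded to 318 in the text
print(f"{co2_kg / distance_km * 1000:.0f} g CO2 per passenger-km")  # ~98 g/km implied
```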

Map of flight path and driving directions from Naxos to Pisa

See the map of the shortest flight path between Naxos Island National Airport (JNX) and Pisa International Airport (PSA).

Airport information

Origin: Naxos Island National Airport
City: Naxos
Country: Greece
IATA Code: JNX
ICAO Code: LGNX
Coordinates: 37°4′51″N, 25°22′5″E
Destination: Pisa International Airport
City: Pisa
Country: Italy
IATA Code: PSA
ICAO Code: LIRP
Coordinates: 43°41′2″N, 10°23′33″E
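
The decimal-degree coordinates used in the distance sketches above are straightforward conversions of the DMS values listed here:

```python
# Converting the DMS coordinates listed above to decimal degrees.
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(37, 4, 51, "N"), dms_to_decimal(25, 22, 5, "E"))   # JNX ~ 37.0808, 25.3681
print(dms_to_decimal(43, 41, 2, "N"), dms_to_decimal(10, 23, 33, "E"))  # PSA ~ 43.6839, 10.3925
```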