
How far is Naxos from Chatham Island?

The distance between Chatham Island (Chatham Islands / Tuuta Airport) and Naxos (Naxos Island National Airport) is 11204 miles / 18031 kilometers / 9736 nautical miles.

Chatham Islands / Tuuta Airport – Naxos Island National Airport

Distance: 11204 miles / 18031 kilometers / 9736 nautical miles
Flight time: 21 h 42 min
Time difference: 10 h 45 min
CO2 emission: 1 494 kg


Distance from Chatham Island to Naxos

There are several ways to calculate the distance from Chatham Island to Naxos. Here are two standard methods:

Vincenty's formula (applied above)
  • 11204.088 miles
  • 18031.231 kilometers
  • 9736.086 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
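As a sketch of how such a figure can be computed, here is a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinate values are taken from the airport information below; the function name and defaults are illustrative, not the calculator's actual code.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2,
                a=6378137.0, f=1 / 298.257223563, max_iter=200, tol=1e-12):
    """Ellipsoidal distance (km) via Vincenty's inverse formula.
    May fail to converge for near-antipodal points."""
    b = (1 - f) * a
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# CHT (43°48′36″S, 176°27′25″W) to JNX (37°4′51″N, 25°22′5″E)
d = vincenty_km(-43.81, -176.45694, 37.08083, 25.36806)
```

The iteration converges quickly for most airport pairs; only routes within a fraction of a degree of antipodal can fail, which is why production geodesy libraries fall back to more robust algorithms.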

Haversine formula
  • 11203.293 miles
  • 18029.953 kilometers
  • 9735.396 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
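A minimal haversine implementation, assuming the commonly used mean Earth radius of 6371 km (the page does not state which radius it uses), reproduces the spherical figure above from the airport coordinates listed below:

```python
from math import radians, sin, cos, atan2, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) between two lat/lon points on a sphere."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return radius_km * 2 * atan2(sqrt(a), sqrt(1 - a))

# CHT: 43°48′36″S, 176°27′25″W   JNX: 37°4′51″N, 25°22′5″E
cht = (-(43 + 48 / 60 + 36 / 3600), -(176 + 27 / 60 + 25 / 3600))
jnx = (37 + 4 / 60 + 51 / 3600, 25 + 22 / 60 + 5 / 3600)
d = haversine_km(*cht, *jnx)   # ≈ 18030 km
```

Because it treats the Earth as a perfect sphere, the haversine result differs slightly from the Vincenty result, which accounts for the Earth's equatorial bulge.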

How long does it take to fly from Chatham Island to Naxos?

The estimated flight time from Chatham Islands / Tuuta Airport to Naxos Island National Airport is 21 hours and 42 minutes.
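The page does not state how it derives this estimate, but dividing the quoted distance by the quoted time shows the implied average speed, which is in the typical range for a long-haul jet:

```python
distance_miles = 11204
flight_hours = 21 + 42 / 60          # 21 h 42 min = 21.7 h
avg_speed_mph = distance_miles / flight_hours   # ≈ 516 mph
```

An implied block speed of roughly 516 mph is consistent with typical jet cruise speeds once climb and descent are averaged in.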

Flight carbon footprint between Chatham Islands / Tuuta Airport (CHT) and Naxos Island National Airport (JNX)

On average, flying from Chatham Island to Naxos generates about 1 494 kg of CO2 per passenger, roughly 3 295 pounds (lbs). These figures are estimates and cover only the CO2 produced by burning jet fuel.
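The page does not publish its emission factor, but one can be inferred from its own numbers: about 83 g of CO2 per passenger-kilometer, a figure in line with typical long-haul estimates. The factor below is reverse-engineered for illustration, not a published constant:

```python
distance_km = 18031
co2_per_pax_km_kg = 0.0829           # assumption: ~83 g CO2 per passenger-km
co2_kg = distance_km * co2_per_pax_km_kg   # ≈ 1 494 kg
co2_lbs = co2_kg * 2.20462                 # ≈ 3 295 lbs
```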

Map of flight path from Chatham Island to Naxos

See the map of the shortest flight path between Chatham Islands / Tuuta Airport (CHT) and Naxos Island National Airport (JNX).

Airport information

Origin: Chatham Islands / Tuuta Airport
City: Chatham Island
Country: New Zealand
IATA Code: CHT
ICAO Code: NZCI
Coordinates: 43°48′36″S, 176°27′25″W

Destination: Naxos Island National Airport
City: Naxos
Country: Greece
IATA Code: JNX
ICAO Code: LGNX
Coordinates: 37°4′51″N, 25°22′5″E