
How far is Naxos from Çorlu?

The distance between Çorlu (Tekirdağ Çorlu Airport) and Naxos (Naxos Island National Airport) is 312 miles / 501 kilometers / 271 nautical miles.

The driving distance from Çorlu (TEQ) to Naxos (JNX) is 820 miles / 1320 kilometers, and travel time by car is about 30 hours 24 minutes.

Tekirdağ Çorlu Airport – Naxos Island National Airport

312 miles / 501 kilometers / 271 nautical miles


Distance from Çorlu to Naxos

There are several ways to calculate the distance from Çorlu to Naxos. Here are two standard methods:

Vincenty's formula (applied above)
  • 311.609 miles
  • 501.486 kilometers
  • 270.781 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
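The page doesn't show the computation itself. As a rough illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid; variable names follow the usual statement of the formula, and the decimal coordinates are converted from the airport information listed below:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    # WGS-84 ellipsoid
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L                     # first guess: difference in longitude
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break               # may not converge for near-antipodal points

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344   # metres -> statute miles

# TEQ and JNX in decimal degrees (from the coordinates in the airport information)
print(f"{vincenty_miles(41.1381, 27.9189, 37.0808, 25.3681):.3f} mi")  # ≈ 311.6
```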

Haversine formula
  • 311.863 miles
  • 501.896 kilometers
  • 271.002 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
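The haversine computation fits in a few lines. A minimal Python sketch, using the mean Earth radius of 6,371 km (the choice of radius is one reason the two methods disagree slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere; radius_km defaults to the mean Earth radius.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

print(f"{haversine_km(41.1381, 27.9189, 37.0808, 25.3681):.1f} km")  # ≈ 501.9 km
```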

How long does it take to fly from Çorlu to Naxos?

The estimated flight time from Tekirdağ Çorlu Airport to Naxos Island National Airport is 1 hour and 5 minutes.
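The page doesn't say how the flight time is derived. A common rule of thumb, sketched below with assumed parameters (a 500 mph average cruise speed and a 30-minute allowance for taxi, climb, and descent; neither is published by the calculator), lands close to the quoted figure:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # cruise_mph and overhead_min are assumed round numbers for a narrow-body jet;
    # they are not documented by the calculator itself.
    return overhead_min + 60 * distance_miles / cruise_mph

print(round(estimated_flight_minutes(312)))  # ≈ 67 min, close to the 1 h 5 min above
```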

Flight carbon footprint between Tekirdağ Çorlu Airport (TEQ) and Naxos Island National Airport (JNX)

On average, flying from Çorlu to Naxos generates about 71 kg (156 lbs) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
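Calculators of this kind typically multiply distance by a per-passenger emission factor for the route's distance band. The sketch below uses an assumed short-haul factor of 0.142 kg CO2 per passenger-km; the site does not document its actual factor:

```python
def co2_per_passenger_kg(distance_km, kg_per_pax_km=0.142):
    # 0.142 kg CO2 per passenger-km is an assumed short-haul factor; real
    # calculators vary it with aircraft type, load factor, and distance band.
    return distance_km * kg_per_pax_km

print(f"{co2_per_passenger_kg(501):.0f} kg CO2 per passenger")  # ≈ 71 kg, as above
```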

Map of flight path and driving directions from Çorlu to Naxos

See the map of the shortest flight path between Tekirdağ Çorlu Airport (TEQ) and Naxos Island National Airport (JNX).

Airport information

Origin: Tekirdağ Çorlu Airport
City: Çorlu
Country: Turkey
IATA Code: TEQ
ICAO Code: LTBU
Coordinates: 41°8′17″N, 27°55′8″E
Destination: Naxos Island National Airport
City: Naxos
Country: Greece
IATA Code: JNX
ICAO Code: LGNX
Coordinates: 37°4′51″N, 25°22′5″E
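The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A minimal conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Southern and western hemispheres carry a negative sign.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# TEQ: 41°8′17″N, 27°55′8″E  ->  ≈ (41.1381, 27.9189)
print(dms_to_decimal(41, 8, 17, "N"), dms_to_decimal(27, 55, 8, "E"))
```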