How far is Naxos from Toronto?
The distance between Toronto (Toronto Pearson International Airport) and Naxos (Naxos Island National Airport) is 5161 miles / 8305 kilometers / 4485 nautical miles.
Toronto Pearson International Airport – Naxos Island National Airport
Distance from Toronto to Naxos
There are several ways to calculate the distance from Toronto to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 5160.757 miles
- 8305.433 kilometers
- 4484.575 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
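For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the YYZ and JNX values from the tables below, converted to decimal degrees, so the result should agree with the figure above to within coordinate rounding.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0          # semi-major axis (m)
    f = 1 / 298.257223563  # flattening
    b = (1 - f) * a        # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
          * (-3 + 4 * cos2SigmaM ** 2)))
    meters = b * A * (sigma - deltaSigma)
    return meters / 1609.344  # meters -> statute miles

# YYZ and JNX coordinates (decimal degrees, from the tables below)
print(round(vincenty_miles(43.6769, -79.6306, 37.0808, 25.3681), 3))
# ≈ 5160.757 miles
```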
Haversine formula
- 5148.163 miles
- 8285.166 kilometers
- 4473.631 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface).
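A short Python sketch of the haversine formula, assuming a mean earth radius of 3,958.8 miles; the choice of radius accounts for small differences in the last few digits.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    # Great-circle distance on a sphere; 3958.8 mi is a common
    # mean earth radius (the choice nudges the final digits).
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_mi * math.asin(math.sqrt(a))

print(round(haversine_miles(43.6769, -79.6306, 37.0808, 25.3681), 3))
# ≈ 5148 miles, matching the haversine figure above to within rounding
```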
How long does it take to fly from Toronto to Naxos?
The estimated flight time from Toronto Pearson International Airport to Naxos Island National Airport is 10 hours and 16 minutes.
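Estimates like this are typically derived from the distance using a fixed cruise speed plus a buffer for takeoff and landing. The exact parameters behind the 10 hours 16 minutes figure aren't stated, so the values in this sketch (500 mph, 30 minutes) are illustrative assumptions only.

```python
def estimate_flight_time(distance_mi, cruise_mph=500, buffer_min=30):
    # Rule-of-thumb estimate: cruise time plus a fixed buffer for taxi,
    # climb, and descent. Parameters are illustrative assumptions, not
    # the exact inputs behind the 10 h 16 min figure above.
    total_min = distance_mi / cruise_mph * 60 + buffer_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(5160.757))  # ~10 h 49 min under these assumptions
```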
What is the time difference between Toronto and Naxos?
The time difference between Toronto and Naxos is 7 hours: Naxos is 7 hours ahead of Toronto. Toronto observes Eastern Time (UTC−5, UTC−4 in summer) and Naxos observes Eastern European Time (UTC+2, UTC+3 in summer); for a few weeks each year, while only one side has switched to or from daylight saving time, the gap is 6 hours.
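A quick check with Python's standard zoneinfo module, assuming the IANA zones America/Toronto and Europe/Athens (which Naxos follows):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Naxos keeps Greek time, Europe/Athens in the IANA database
now = datetime.now(ZoneInfo("America/Toronto"))
diff = now.astimezone(ZoneInfo("Europe/Athens")).utcoffset() - now.utcoffset()
print(diff)  # 7:00:00 for most of the year
```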
Flight carbon footprint between Toronto Pearson International Airport (YYZ) and Naxos Island National Airport (JNX)
On average, flying from Toronto to Naxos generates about 605 kg of CO2 per passenger, which is roughly 1,334 pounds (lb). The figures are estimates and include only the CO2 generated by burning jet fuel.
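The pound conversion is plain arithmetic (1 lb = 0.45359237 kg exactly):

```python
KG_PER_LB = 0.45359237               # exact definition of the avoirdupois pound
co2_kg = 605                         # per-passenger estimate from the text
print(round(co2_kg / KG_PER_LB, 1))  # 1333.8, i.e. the ~1,334 lb quoted above
```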
Map of flight path from Toronto to Naxos
See the map of the shortest flight path between Toronto Pearson International Airport (YYZ) and Naxos Island National Airport (JNX).
Airport information
| Origin | Toronto Pearson International Airport |
|---|---|
| City | Toronto |
| Country | Canada |
| IATA Code | YYZ |
| ICAO Code | CYYZ |
| Coordinates | 43°40′37″N, 79°37′50″W |
| Destination | Naxos Island National Airport |
|---|---|
| City | Naxos |
| Country | Greece |
| IATA Code | JNX |
| ICAO Code | LGNX |
| Coordinates | 37°4′51″N, 25°22′5″E |
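The coordinates above are given in degrees, minutes, and seconds. A small helper (hypothetical, written for use with the distance sketches earlier) converts them to the signed decimal degrees those formulas expect:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Convert degrees/minutes/seconds to signed decimal degrees;
    # southern (S) and western (W) hemispheres are negative.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(43, 40, 37, "N"), dms_to_decimal(79, 37, 50, "W"))
# ≈ 43.6769 -79.6306, the YYZ values used in the distance sketches above
```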