How far is Naxos from Fort McMurray?

The distance between Fort McMurray (Fort McMurray International Airport) and Naxos (Naxos Island National Airport) is 5496 miles / 8844 kilometers / 4776 nautical miles.

Fort McMurray International Airport – Naxos Island National Airport

5496 miles / 8844 kilometers / 4776 nautical miles

Distance from Fort McMurray to Naxos

There are several ways to calculate the distance from Fort McMurray to Naxos. Here are two standard methods:

Vincenty's formula (applied above)
  • 5495.672 miles
  • 8844.427 kilometers
  • 4775.609 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
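For readers who want to reproduce the ellipsoidal figure, here is a minimal sketch using pyproj's Geod class. It solves the same inverse geodesic problem on the WGS84 ellipsoid that Vincenty's formula addresses (so the result matches to well within a metre); the coordinates are the decimal-degree equivalents of the DMS values listed in the airport information section below.

```python
# Ellipsoidal (geodesic) distance between YMM and JNX on the WGS84 ellipsoid.
# Vincenty's formula solves the same inverse geodesic problem, so this
# reproduces the ~8844 km figure quoted above.
from pyproj import Geod

# Decimal-degree coordinates, converted from the DMS values in the
# "Airport information" section.
ymm_lat, ymm_lon = 56.6531, -111.2219   # Fort McMurray (YMM)
jnx_lat, jnx_lon = 37.0808, 25.3681     # Naxos (JNX)

geod = Geod(ellps="WGS84")
# Geod.inv takes lon/lat pairs and returns forward azimuth, back azimuth,
# and distance in metres.
_, _, dist_m = geod.inv(ymm_lon, ymm_lat, jnx_lon, jnx_lat)

print(f"{dist_m / 1000:.1f} km")              # ≈ 8844 km
print(f"{dist_m / 1609.344:.1f} miles")       # ≈ 5496 miles
print(f"{dist_m / 1852:.1f} nautical miles")  # ≈ 4776 NM
```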

Haversine formula
  • 5481.420 miles
  • 8821.491 kilometers
  • 4763.224 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
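A short Python sketch of the haversine calculation, assuming a spherical Earth with a mean radius of 6,371 km, reproduces the great-circle figure above:

```python
# Great-circle (haversine) distance between the same two airports,
# assuming a spherical Earth with mean radius 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Return the great-circle distance between two points in kilometres."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(56.6531, -111.2219, 37.0808, 25.3681)
print(f"{km:.1f} km")                # ≈ 8821 km
print(f"{km / 1.609344:.1f} miles")  # ≈ 5481 miles
```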

How long does it take to fly from Fort McMurray to Naxos?

The estimated flight time from Fort McMurray International Airport to Naxos Island National Airport is 10 hours and 54 minutes.
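The exact assumptions behind this estimate are not stated. A common rule of thumb is to divide the great-circle distance by an assumed average cruise speed; the sketch below uses roughly 500 mph (an assumption, not the site's published formula) and lands close to the figure above.

```python
# Rough flight-time estimate: distance divided by an assumed average
# cruise speed. The 500 mph value is an illustrative assumption; the
# site's exact formula is not published.
distance_miles = 5495.672
cruise_mph = 500  # assumed typical jet cruise speed

hours = distance_miles / cruise_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"~{h} h {m} min")  # ~10 h 59 min, close to the 10 h 54 min shown above
```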

Flight carbon footprint between Fort McMurray International Airport (YMM) and Naxos Island National Airport (JNX)

On average, flying from Fort McMurray to Naxos generates about 649 kg of CO2 per passenger, which is equivalent to about 1,431 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
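The pound figure follows directly from the kilogram figure. The short sketch below shows that conversion, along with the per-kilometre intensity implied by the numbers quoted above (how the 649 kg estimate itself is derived is not published here).

```python
# Unit conversion and implied per-kilometre intensity for the CO2 figure above.
co2_kg = 649.0
distance_km = 8844.0

co2_lbs = co2_kg * 2.20462      # kilograms -> pounds
per_km = co2_kg / distance_km   # implied kg CO2 per passenger-km

print(f"{co2_lbs:.0f} lbs")                     # ≈ 1431 lbs
print(f"{per_km:.3f} kg CO2 per passenger-km")  # ≈ 0.073
```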

Map of flight path from Fort McMurray to Naxos

See the map of the shortest flight path between Fort McMurray International Airport (YMM) and Naxos Island National Airport (JNX).

Airport information

Origin Fort McMurray International Airport
City: Fort McMurray
Country: Canada
IATA Code: YMM
ICAO Code: CYMM
Coordinates: 56°39′11″N, 111°13′19″W
Destination Naxos Island National Airport
City: Naxos
Country: Greece
IATA Code: JNX
ICAO Code: LGNX
Coordinates: 37°4′51″N, 25°22′5″E
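The coordinates above are given in degrees, minutes and seconds. A small helper like the one below (the function name is illustrative, not part of any particular library) converts them to the decimal degrees used by the distance formulas earlier on this page.

```python
# Convert the DMS coordinates listed above to decimal degrees.
# Sign convention: N/E positive, S/W negative.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

ymm = (dms_to_decimal(56, 39, 11, "N"), dms_to_decimal(111, 13, 19, "W"))
jnx = (dms_to_decimal(37, 4, 51, "N"), dms_to_decimal(25, 22, 5, "E"))
print(f"{ymm[0]:.4f}, {ymm[1]:.4f}")  # 56.6531, -111.2219
print(f"{jnx[0]:.4f}, {jnx[1]:.4f}")  # 37.0808, 25.3681
```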