How far is Naxos from Montego Bay?
The distance between Montego Bay (Sangster International Airport) and Naxos (Naxos Island National Airport) is 6159 miles / 9911 kilometers / 5352 nautical miles.
Sangster International Airport – Naxos Island National Airport
Distance from Montego Bay to Naxos
There are several ways to calculate the distance from Montego Bay to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 6158.550 miles
- 9911.226 kilometers
- 5351.634 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
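For readers who want to reproduce the ellipsoidal figure, here is a minimal sketch in Python using the third-party geographiclib package. It applies Karney's algorithm rather than Vincenty's iteration, but both solve the inverse geodesic problem on the WGS84 ellipsoid and agree very closely for this route; the decimal coordinates are converted from the DMS values in the airport tables below.

```python
# pip install geographiclib
from geographiclib.geodesic import Geodesic

# MBJ and JNX in decimal degrees, converted from the airport tables below.
mbj = (18.5036, -77.9133)
jnx = (37.0808, 25.3681)

# Inverse geodesic problem on the WGS84 ellipsoid; 's12' is the distance in meters.
result = Geodesic.WGS84.Inverse(mbj[0], mbj[1], jnx[0], jnx[1])
print(f"{result['s12'] / 1000:.1f} km")  # ≈ 9,911 km, in line with the figure above
```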
Haversine formula
- 6148.928 miles
- 9895.740 kilometers
- 5343.272 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance – the shortest path between the two points along the surface.
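The haversine calculation is simple enough to reproduce directly. Here is a minimal sketch in Python, assuming a mean Earth radius of 6,371 km (the page does not state which radius it uses):

```python
from math import radians, sin, cos, asin, sqrt

# Mean Earth radius in kilometers (assumed; the page does not say which
# radius its haversine figure is based on).
EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# MBJ and JNX in decimal degrees, converted from the airport tables below.
mbj = (18.5036, -77.9133)
jnx = (37.0808, 25.3681)
print(f"{haversine_km(*mbj, *jnx):.1f} km")  # ≈ 9,896 km, in line with the figure above
```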
How long does it take to fly from Montego Bay to Naxos?
The estimated flight time from Sangster International Airport to Naxos Island National Airport is 12 hours and 9 minutes.
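The page does not state how this estimate is derived. A common rule of thumb is to divide the distance by an average cruise speed and add a fixed allowance for taxi, takeoff and landing; the sketch below uses assumed values (500 mph and 30 minutes), so it will not exactly reproduce the 12 hours and 9 minutes quoted above.

```python
# Assumed values, not figures from this page: 500 mph average cruise speed
# plus a 30-minute allowance for taxi, takeoff and landing.
def estimated_flight_time(distance_miles: float,
                          cruise_mph: float = 500.0,
                          overhead_hours: float = 0.5) -> str:
    hours = distance_miles / cruise_mph + overhead_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} h {m} min"

print(estimated_flight_time(6158.55))  # rough estimate; the page quotes 12 h 9 min
```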
What is the time difference between Montego Bay and Naxos?
The time difference between Montego Bay and Naxos is 7 hours. Naxos is 7 hours ahead of Montego Bay.
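This can be checked against the IANA time zone database. The sketch below assumes the zones America/Jamaica and Europe/Athens; Jamaica does not observe daylight saving time while Greece does, so the gap is 7 hours when Greece is on standard time.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# Assumed IANA zones for the two cities.
JAMAICA = ZoneInfo("America/Jamaica")   # UTC-5 all year (no daylight saving)
GREECE = ZoneInfo("Europe/Athens")      # UTC+2 standard time, UTC+3 in summer

now = datetime.now(timezone.utc)
offset_diff = now.astimezone(GREECE).utcoffset() - now.astimezone(JAMAICA).utcoffset()
print(f"Naxos is {offset_diff.total_seconds() / 3600:.0f} hours ahead of Montego Bay")
```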
Flight carbon footprint between Sangster International Airport (MBJ) and Naxos Island National Airport (JNX)
On average, flying from Montego Bay to Naxos generates about 738 kg of CO2 per passenger, which equals roughly 1,627 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
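The pound figure is just a unit conversion of the per-passenger estimate; a minimal sketch using the standard factor 1 kg ≈ 2.20462 lb:

```python
# Standard conversion factor: 1 kilogram = 2.20462 pounds.
KG_TO_LB = 2.20462

co2_kg = 738  # per-passenger estimate from this page
print(f"{co2_kg} kg ≈ {co2_kg * KG_TO_LB:.0f} lb")  # ≈ 1,627 lb
```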
Map of flight path from Montego Bay to Naxos
See the map of the shortest flight path between Sangster International Airport (MBJ) and Naxos Island National Airport (JNX).
Airport information
| Origin | Sangster International Airport |
| --- | --- |
| City: | Montego Bay |
| Country: | Jamaica |
| IATA Code: | MBJ |
| ICAO Code: | MKJS |
| Coordinates: | 18°30′13″N, 77°54′48″W |
| Destination | Naxos Island National Airport |
| --- | --- |
| City: | Naxos |
| Country: | Greece |
| IATA Code: | JNX |
| ICAO Code: | LGNX |
| Coordinates: | 37°4′51″N, 25°22′5″E |
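The coordinates above are given in degrees, minutes and seconds. Here is a minimal sketch for converting them to the decimal degrees used by the distance formulas earlier on the page (the helper name is illustrative, not from this site):

```python
# Illustrative helper, not from this site: converts degrees/minutes/seconds
# plus a hemisphere letter into signed decimal degrees.
def dms_to_decimal(degrees: int, minutes: int, seconds: float, hemisphere: str) -> float:
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Sangster International Airport (MBJ): 18°30′13″N, 77°54′48″W
print(dms_to_decimal(18, 30, 13, "N"), dms_to_decimal(77, 54, 48, "W"))
# Naxos Island National Airport (JNX): 37°4′51″N, 25°22′5″E
print(dms_to_decimal(37, 4, 51, "N"), dms_to_decimal(25, 22, 5, "E"))
```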