How far is Naxos from Buffalo, NY?
The distance between Buffalo (Buffalo Niagara International Airport) and Naxos (Naxos Island National Airport) is 5155 miles / 8296 kilometers / 4479 nautical miles.
Buffalo Niagara International Airport – Naxos Island National Airport
Distance from Buffalo to Naxos
There are several ways to calculate the distance from Buffalo to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 5154.913 miles
- 8296.029 kilometers
- 4479.497 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
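The sketch below is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates from the tables further down converted to decimal degrees. It is illustrative only; the exact parameters used for the figure above are not published.

```python
import math

# WGS-84 ellipsoid constants (assumed; the page does not state its datum)
A_AXIS = 6378137.0              # semi-major axis in metres
F = 1 / 298.257223563           # flattening
B_AXIS = (1 - F) * A_AXIS       # semi-minor axis

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal (Vincenty) distance between two lat/lon points, in kilometres."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                      # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:       # converged
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma) / 1000.0

# BUF (42°56′25″N, 78°43′55″W) to JNX (37°4′51″N, 25°22′5″E)
print(round(vincenty_km(42.9403, -78.7319, 37.0808, 25.3681), 1))
# should come out near the ~8296 km figure quoted above
```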
Haversine formula
- 5142.404 miles
- 8275.897 kilometers
- 4468.627 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points along the surface of a sphere).
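For comparison, here is a short Python sketch of the haversine formula. The mean Earth radius of 3,958.8 miles is an assumption; with it, the result comes out close to the ~5,142-mile figure quoted above.

```python
import math

EARTH_RADIUS_MI = 3958.8   # assumed mean Earth radius in miles

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_MI * math.asin(math.sqrt(a))

# BUF to JNX, using the coordinates from the airport tables below
print(round(haversine_miles(42.9403, -78.7319, 37.0808, 25.3681), 1))  # ≈ 5142 mi
```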
How long does it take to fly from Buffalo to Naxos?
The estimated flight time from Buffalo Niagara International Airport to Naxos Island National Airport is 10 hours and 15 minutes.
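The page does not explain how the 10 hours 15 minutes figure is derived. A common back-of-the-envelope estimate is the great-circle distance divided by an assumed average speed; the sketch below uses 500 mph, which lands within a few minutes of the quoted figure. Both the method and the speed are assumptions, not the site's calculation.

```python
# Rough flight-time estimate: distance divided by an assumed average speed.
DISTANCE_MI = 5154.913       # Vincenty distance from above
AVG_SPEED_MPH = 500          # assumed average block speed

hours = DISTANCE_MI / AVG_SPEED_MPH
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")      # ≈ 10 h 19 min, close to the 10 h 15 min quoted above
```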
What is the time difference between Buffalo and Naxos?
The time difference between Buffalo and Naxos is 7 hours. Naxos is 7 hours ahead of Buffalo.
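You can check the offset yourself with Python's standard zoneinfo module (Python 3.9+), using America/New_York for Buffalo and Europe/Athens for Naxos:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

buffalo = datetime.now(ZoneInfo("America/New_York"))
naxos = buffalo.astimezone(ZoneInfo("Europe/Athens"))
diff = naxos.utcoffset() - buffalo.utcoffset()
print(diff)   # 7:00:00 for most of the year (both regions observe DST)
```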
Flight carbon footprint between Buffalo Niagara International Airport (BUF) and Naxos Island National Airport (JNX)
On average, flying from Buffalo to Naxos generates about 604 kg of CO2 per passenger; 604 kilograms equals 1,332 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
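The unit conversion, and the per-passenger emission factor implied by the figures above, can be reproduced directly (estimate only; it covers only CO2 from jet fuel burn):

```python
# Numbers taken from the figures quoted above
co2_kg = 604
distance_mi = 5154.913

co2_lb = co2_kg * 2.20462            # kilograms to pounds
per_mile_kg = co2_kg / distance_mi   # implied per-passenger factor

print(f"{co2_lb:.0f} lb")            # ≈ 1332 lb
print(f"{per_mile_kg:.3f} kg/mi")    # ≈ 0.117 kg of CO2 per passenger-mile
```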
Map of flight path from Buffalo to Naxos
See the map of the shortest flight path between Buffalo Niagara International Airport (BUF) and Naxos Island National Airport (JNX).
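The map itself is not reproduced here. If you want to plot the great-circle route yourself, one option (an assumption, not how the site draws its map) is to sample intermediate points along the geodesic with pyproj:

```python
from pyproj import Geod   # assumes pyproj is installed

geod = Geod(ellps="WGS84")
# 20 intermediate points along the BUF -> JNX geodesic (lon/lat order)
waypoints = geod.npts(-78.7319, 42.9403, 25.3681, 37.0808, 20)
for lon, lat in waypoints:
    print(f"{lat:.3f}, {lon:.3f}")
```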
Airport information
| Origin | Buffalo Niagara International Airport |
| --- | --- |
| City | Buffalo, NY |
| Country | United States |
| IATA Code | BUF |
| ICAO Code | KBUF |
| Coordinates | 42°56′25″N, 78°43′55″W |
| Destination | Naxos Island National Airport |
| --- | --- |
| City | Naxos |
| Country | Greece |
| IATA Code | JNX |
| ICAO Code | LGNX |
| Coordinates | 37°4′51″N, 25°22′5″E |
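The coordinates above are in degrees, minutes, and seconds; the distance formulas earlier expect decimal degrees. A small helper for the conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Coordinates from the airport tables above
buf = (dms_to_decimal(42, 56, 25, "N"), dms_to_decimal(78, 43, 55, "W"))
jnx = (dms_to_decimal(37, 4, 51, "N"), dms_to_decimal(25, 22, 5, "E"))
print(buf)   # ≈ (42.9403, -78.7319)
print(jnx)   # ≈ (37.0808, 25.3681)
```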