How far is Naxos from Barrow, AK?
The distance between Barrow (Wiley Post–Will Rogers Memorial Airport) and Naxos (Naxos Island National Airport) is 4962 miles / 7985 kilometers / 4311 nautical miles.
Wiley Post–Will Rogers Memorial Airport – Naxos Island National Airport
Distance from Barrow to Naxos
There are several ways to calculate the distance from Barrow to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 4961.544 miles
- 7984.831 kilometers
- 4311.464 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
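As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. This is not necessarily the exact implementation used for the figure above; the coordinates are the BRW and JNX values from the airport tables further down.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Distance in metres on the WGS-84 ellipsoid (Vincenty inverse formula).
    May fail to converge for nearly antipodal points."""
    a = 6378137.0           # semi-major axis (m)
    f = 1 / 298.257223563   # flattening
    b = (1 - f) * a         # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                       if cos_sq_alpha else 0.0)  # equatorial line
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sigma_m ** 2) -
        B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# BRW (71°17′7″N, 156°45′57″W) to JNX (37°4′51″N, 25°22′5″E)
print(vincenty_distance_m(71.28528, -156.76583, 37.08083, 25.36806) / 1000)
# ≈ 7985 km, consistent with the figure above
```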
Haversine formula
- 4948.663 miles
- 7964.101 kilometers
- 4300.270 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
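A minimal Python sketch of the haversine formula, assuming a mean Earth radius of 6,371 km (the radius actually used for the figure above is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# BRW to JNX, coordinates from the airport tables below
print(haversine_km(71.28528, -156.76583, 37.08083, 25.36806))
# ≈ 7964 km, close to the haversine figure above
```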
How long does it take to fly from Barrow to Naxos?
The estimated flight time from Wiley Post–Will Rogers Memorial Airport to Naxos Island National Airport is 9 hours and 53 minutes.
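The quoted time is roughly what you get by dividing the distance by a flat average speed of about 500 mph. The sketch below uses that rule of thumb, which is an assumption rather than a method stated on this page:

```python
def estimated_flight_time(distance_miles, cruise_mph=500.0):
    """Very rough block-time estimate: distance divided by an assumed
    average speed (500 mph is a common rule-of-thumb figure)."""
    hours = distance_miles / cruise_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m:02d} min"

print(estimated_flight_time(4961.544))
# ≈ 9 h 55 min, in the same ballpark as the 9 h 53 min quoted above
```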
What is the time difference between Barrow and Naxos?
The time difference between Barrow and Naxos is 11 hours. Naxos is 11 hours ahead of Barrow.
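A quick way to check this, assuming the IANA time zones America/Anchorage for Barrow and Europe/Athens for Naxos:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

now = datetime.now(ZoneInfo("UTC"))
barrow = now.astimezone(ZoneInfo("America/Anchorage"))  # Barrow uses Alaska Time
naxos = now.astimezone(ZoneInfo("Europe/Athens"))       # Naxos uses Eastern European Time

offset_hours = (naxos.utcoffset() - barrow.utcoffset()).total_seconds() / 3600
print(offset_hours)  # 11.0 for most of the year
```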
Flight carbon footprint between Wiley Post–Will Rogers Memorial Airport (BRW) and Naxos Island National Airport (JNX)
On average, flying from Barrow to Naxos generates about 579 kg of CO2 per passenger; 579 kilograms equals 1,276 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
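For context, 579 kg over roughly 4,962 miles implies about 0.12 kg of CO2 per passenger-mile. The sketch below only reproduces the unit conversion and that implied ratio from the page's own numbers; it is not an official emission factor:

```python
KG_PER_LB = 0.45359237

co2_kg = 579.0             # per-passenger figure quoted above
distance_miles = 4961.544  # Vincenty distance from the section above

co2_lbs = co2_kg / KG_PER_LB
kg_per_mile = co2_kg / distance_miles

print(f"{co2_lbs:.0f} lbs")        # ≈ 1276 lbs, matching the figure above
print(f"{kg_per_mile:.3f} kg/mi")  # ≈ 0.117 kg CO2 per passenger-mile (implied)
```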
Map of flight path from Barrow to Naxos
See the map of the shortest flight path between Wiley Post–Will Rogers Memorial Airport (BRW) and Naxos Island National Airport (JNX).
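The shortest flight path shown on such a map is the great circle between the two airports. Here is a minimal sketch that generates intermediate waypoints along that path (spherical interpolation, with coordinates from the tables below), which could be used to draw it:

```python
import math

def great_circle_waypoints(lat1, lon1, lat2, lon2, n=50):
    """Return n+1 evenly spaced (lat, lon) points along the great circle
    between two coordinates, e.g. for plotting a flight path."""
    def to_xyz(lat, lon):
        lat, lon = math.radians(lat), math.radians(lon)
        return (math.cos(lat) * math.cos(lon),
                math.cos(lat) * math.sin(lon),
                math.sin(lat))

    ax, ay, az = to_xyz(lat1, lon1)
    bx, by, bz = to_xyz(lat2, lon2)
    d = math.acos(max(-1.0, min(1.0, ax * bx + ay * by + az * bz)))  # angular distance

    points = []
    for i in range(n + 1):
        f = i / n
        s1 = math.sin((1 - f) * d) / math.sin(d)
        s2 = math.sin(f * d) / math.sin(d)
        x, y, z = s1 * ax + s2 * bx, s1 * ay + s2 * by, s1 * az + s2 * bz
        points.append((math.degrees(math.asin(z)), math.degrees(math.atan2(y, x))))
    return points

# BRW -> JNX
path = great_circle_waypoints(71.28528, -156.76583, 37.08083, 25.36806)
```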
Airport information
Origin | Wiley Post–Will Rogers Memorial Airport |
---|---|
City: | Barrow, AK |
Country: | United States |
IATA Code: | BRW |
ICAO Code: | PABR |
Coordinates: | 71°17′7″N, 156°45′57″W |
Destination | Naxos Island National Airport |
---|---|
City: | Naxos |
Country: | Greece |
IATA Code: | JNX |
ICAO Code: | LGNX |
Coordinates: | 37°4′51″N, 25°22′5″E |