How far is Naxos from Pittsburgh, PA?
The distance between Pittsburgh (Pittsburgh International Airport) and Naxos (Naxos Island National Airport) is 5319 miles / 8560 kilometers / 4622 nautical miles.
Pittsburgh International Airport – Naxos Island National Airport
Distance from Pittsburgh to Naxos
There are several ways to calculate the distance from Pittsburgh to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 5318.998 miles
- 8560.097 kilometers
- 4622.083 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
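For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of Vincenty's inverse formula, assuming the WGS-84 ellipsoid and the PIT/JNX coordinates from the airport tables below (converted to decimal degrees). Small differences from the 5318.998-mile figure can arise from rounding of the coordinates.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Ellipsoidal distance via Vincenty's inverse formula (WGS-84 assumed)."""
    a = 6378137.0              # semi-major axis, metres
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis, metres

    # Reduced latitudes and the difference in longitude
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # metres -> statute miles

# PIT and JNX coordinates from the airport tables below, in decimal degrees
print(vincenty_miles(40.49139, -80.23278, 37.08083, 25.36806))  # ≈ 5319 miles
```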
Haversine formula
- 5306.397 miles
- 8539.819 kilometers
- 4611.133 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth; this gives the great-circle distance, the shortest path between two points along the surface.
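For comparison, a minimal haversine sketch in Python, assuming a mean Earth radius of 3,958.8 miles (about 6,371 km) and the same decimal-degree coordinates:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    # Great-circle distance on a sphere (mean Earth radius assumed)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_miles * math.asin(math.sqrt(a))

# PIT and JNX coordinates from the airport tables below, in decimal degrees
print(haversine_miles(40.49139, -80.23278, 37.08083, 25.36806))  # ≈ 5306 miles
```

The spherical result is about 13 miles shorter than the ellipsoidal one, which is the gap between the two sets of figures above.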
How long does it take to fly from Pittsburgh to Naxos?
The estimated flight time from Pittsburgh International Airport to Naxos Island National Airport is 10 hours and 34 minutes.
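The assumptions behind this estimate aren't stated, but such figures are usually derived from the great-circle distance and an assumed average speed. A rough sketch, assuming a hypothetical average block speed of 500 mph (the optional overhead allowance for taxi, takeoff, and landing is also an assumption):

```python
def estimated_flight_time(distance_miles, avg_speed_mph=500, overhead_minutes=0):
    # Crude block-time estimate: cruise time at an assumed average speed,
    # plus an optional flat allowance (both values are assumptions)
    total_minutes = distance_miles / avg_speed_mph * 60 + overhead_minutes
    hours, minutes = divmod(round(total_minutes), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimated_flight_time(5319))  # "10 hours and 38 minutes" under these assumptions
```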
What is the time difference between Pittsburgh and Naxos?
The time difference between Pittsburgh and Naxos is 7 hours. Naxos is 7 hours ahead of Pittsburgh.
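To check the offset programmatically, here is a minimal sketch using Python's standard zoneinfo module, assuming Pittsburgh follows America/New_York time and Naxos follows Europe/Athens time:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Compare UTC offsets at a given moment; the gap is 7 hours most of the year
# but can differ briefly around the US and EU daylight-saving transitions.
when = datetime(2024, 1, 15, 12, 0)
pit = when.replace(tzinfo=ZoneInfo("America/New_York")).utcoffset()
jnx = when.replace(tzinfo=ZoneInfo("Europe/Athens")).utcoffset()
print((jnx - pit).total_seconds() / 3600)  # 7.0 for this date
```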
Flight carbon footprint between Pittsburgh International Airport (PIT) and Naxos Island National Airport (JNX)
On average, flying from Pittsburgh to Naxos generates about 626 kg of CO2 per passenger, and 626 kilograms equals 1,379 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
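A quick conversion check using the exact kilogram-to-pound definition; the small gap versus the 1,379 lb figure quoted above suggests the conversion was done before the kilogram value was rounded:

```python
KG_PER_LB = 0.45359237       # exact definition of the avoirdupois pound

co2_kg = 626                 # per-passenger estimate quoted above
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))         # 1380; an unrounded kg value near 625.6 gives 1,379
```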
Map of flight path from Pittsburgh to Naxos
See the map of the shortest flight path between Pittsburgh International Airport (PIT) and Naxos Island National Airport (JNX).
Airport information
| Origin | Pittsburgh International Airport |
|---|---|
| City: | Pittsburgh, PA |
| Country: | United States |
| IATA Code: | PIT |
| ICAO Code: | KPIT |
| Coordinates: | 40°29′29″N, 80°13′58″W |
| Destination | Naxos Island National Airport |
|---|---|
| City: | Naxos |
| Country: | Greece |
| IATA Code: | JNX |
| ICAO Code: | LGNX |
| Coordinates: | 37°4′51″N, 25°22′5″E |
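The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion sketch (the helper name is just for illustration):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Convert degrees/minutes/seconds (as in the tables above) to signed decimal degrees
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# PIT: 40°29′29″N, 80°13′58″W   JNX: 37°4′51″N, 25°22′5″E
pit = (dms_to_decimal(40, 29, 29, "N"), dms_to_decimal(80, 13, 58, "W"))
jnx = (dms_to_decimal(37, 4, 51, "N"), dms_to_decimal(25, 22, 5, "E"))
print(pit, jnx)  # ≈ (40.4914, -80.2328) and (37.0808, 25.3681)
```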