How far is Naxos from Shanghai?
The distance between Shanghai (Shanghai Pudong International Airport) and Naxos (Naxos Island National Airport) is 5289 miles / 8512 kilometers / 4596 nautical miles.
Shanghai Pudong International Airport – Naxos Island National Airport
Distance from Shanghai to Naxos
There are several ways to calculate the distance from Shanghai to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 5289.268 miles
- 8512.251 kilometers
- 4596.248 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
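For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is an illustration of the standard published algorithm, not the exact code this site runs; the decimal coordinates come from the airport table below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Ellipsoidal distance in meters via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)

    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); equatorial lines have cos2_alpha == 0
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    else:
        raise ValueError("Vincenty iteration failed to converge")

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# PVG (31°8′36″N, 121°48′18″E) to JNX (37°4′51″N, 25°22′5″E)
meters = vincenty_distance(31.14333, 121.80500, 37.08083, 25.36806)
print(f"{meters / 1609.344:.3f} mi")  # ≈ 5289 miles, matching the figure above
```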
Haversine formula
- 5278.160 miles
- 8494.375 kilometers
- 4586.596 nautical miles
The haversine formula calculates the great-circle distance between latitude/longitude points – the shortest path between two points over the surface of a spherical Earth.
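The haversine computation is short enough to show in full. This sketch uses a mean Earth radius of 6371 km, a common convention (the site does not state which radius it uses):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Square of half the chord length between the two points
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# PVG to JNX, airport coordinates converted to decimal degrees
km = haversine_distance(31.14333, 121.80500, 37.08083, 25.36806)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi")  # ≈ 8494 km / 5278 mi
```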
How long does it take to fly from Shanghai to Naxos?
The estimated flight time from Shanghai Pudong International Airport to Naxos Island National Airport is 10 hours and 30 minutes.
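The page does not state how this estimate is derived. A common rule of thumb is cruise distance at roughly 500 mph plus about 30 minutes for taxi, climb, and descent; the sketch below uses those assumed parameters, which land close to, but not exactly on, the figure above (the page's own number implies an effective average speed near 504 mph with no overhead).

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed overhead for
    taxi, climb, and descent. Both parameters are assumptions, not the
    site's published method."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes:02d} min"

print(estimate_flight_time(5289))  # 11 h 05 min under these assumptions
```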
What is the time difference between Shanghai and Naxos?
Naxos is 6 hours behind Shanghai. China does not observe daylight saving time while Greece does, so the gap narrows to 5 hours during Greek summer time.
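You can verify this with Python's standard zoneinfo module; Naxos falls in the Europe/Athens time zone:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def hours_ahead(when: datetime) -> float:
    """Hours Shanghai is ahead of Naxos at the given instant."""
    shanghai = when.astimezone(ZoneInfo("Asia/Shanghai")).utcoffset()
    naxos = when.astimezone(ZoneInfo("Europe/Athens")).utcoffset()
    return (shanghai - naxos).total_seconds() / 3600

print(hours_ahead(datetime(2024, 1, 15, tzinfo=timezone.utc)))  # 6.0 (winter)
print(hours_ahead(datetime(2024, 7, 15, tzinfo=timezone.utc)))  # 5.0 (Greek DST)
```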
Flight carbon footprint between Shanghai Pudong International Airport (PVG) and Naxos Island National Airport (JNX)
On average, flying from Shanghai to Naxos generates about 622 kg of CO2 per passenger; 622 kilograms equals 1,371 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
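The page does not publish its emissions methodology. Dividing its own figures (622 kg over 8512 km) back-calculates an implied factor of roughly 0.073 kg of CO2 per passenger-kilometer; the sketch below simply applies that factor, whereas real calculators use more detailed fuel-burn models.

```python
def co2_per_passenger_kg(distance_km, kg_co2_per_pax_km=0.0731):
    """CO2 estimate as distance times a per-passenger-km factor.
    The default factor is back-calculated from this page's own
    numbers (622 kg / 8512 km), not a published methodology."""
    return distance_km * kg_co2_per_pax_km

kg = co2_per_passenger_kg(8512)
print(f"{kg:.0f} kg CO2 ≈ {kg * 2.20462:.0f} lb")  # ≈ 622 kg ≈ 1,371 lb
```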
Map of flight path from Shanghai to Naxos
See the map of the shortest flight path between Shanghai Pudong International Airport (PVG) and Naxos Island National Airport (JNX).
Airport information
| Origin | Shanghai Pudong International Airport |
| --- | --- |
| City: | Shanghai |
| Country: | China |
| IATA Code: | PVG |
| ICAO Code: | ZSPD |
| Coordinates: | 31°8′36″N, 121°48′18″E |

| Destination | Naxos Island National Airport |
| --- | --- |
| City: | Naxos |
| Country: | Greece |
| IATA Code: | JNX |
| ICAO Code: | LGNX |
| Coordinates: | 37°4′51″N, 25°22′5″E |
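The coordinates above are given in degrees, minutes, and seconds. A small helper converts them to the decimal degrees used by the distance formulas earlier on this page:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# PVG: 31°8′36″N, 121°48′18″E  ->  (31.14333, 121.80500)
# JNX: 37°4′51″N, 25°22′5″E    ->  (37.08083, 25.36806)
pvg = (dms_to_decimal(31, 8, 36, "N"), dms_to_decimal(121, 48, 18, "E"))
jnx = (dms_to_decimal(37, 4, 51, "N"), dms_to_decimal(25, 22, 5, "E"))
print(pvg, jnx)
```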