How far is Santorini from Beijing?
The distance between Beijing (Beijing Nanyuan Airport) and Santorini (Santorini (Thira) International Airport) is 4730 miles / 7611 kilometers / 4110 nautical miles.
Beijing Nanyuan Airport – Santorini (Thira) International Airport
Distance from Beijing to Santorini
There are several ways to calculate the distance from Beijing to Santorini. Here are two standard methods:
Vincenty's formula (applied above)
- 4729.549 miles
- 7611.471 kilometers
- 4109.865 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
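As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (not the page's own code); the coordinates are the NAY and JTR values from the airport tables further down, converted to decimal degrees.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in kilometers between two points (WGS-84)."""
    a = 6378137.0                # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)

    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha != 0 else 0.0)   # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # meters -> kilometers

# NAY and JTR coordinates in decimal degrees
print(vincenty_distance(39.7828, 116.3878, 36.3992, 25.4792))  # ≈ 7611 km
```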
Haversine formula
- 4718.430 miles
- 7593.577 kilometers
- 4100.203 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
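A short Python sketch of the haversine formula, assuming the commonly used mean Earth radius of 6371 km, reproduces the figures listed above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_distance(39.7828, 116.3878, 36.3992, 25.4792)
print(f"{km:.1f} km, {km * 0.621371:.1f} mi, {km / 1.852:.1f} nmi")  # ≈ 7593.6 km
```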
How long does it take to fly from Beijing to Santorini?
The estimated flight time from Beijing Nanyuan Airport to Santorini (Thira) International Airport is 9 hours and 27 minutes.
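The page does not say how this estimate is derived, but it is consistent with simply dividing the Vincenty distance by an assumed average speed of about 500 mph, as in this sketch (the speed is an assumption, not an official figure).

```python
distance_miles = 4729.549   # Vincenty distance from above
avg_speed_mph = 500         # assumed average speed (not stated by the page)

hours = distance_miles / avg_speed_mph
h = int(hours)
m = int((hours - h) * 60)   # minutes truncated
print(f"Estimated flight time: {h} h {m} min")   # -> 9 h 27 min
```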
What is the time difference between Beijing and Santorini?
The standard time difference between Beijing (CST, UTC+8) and Santorini (EET, UTC+2) is 6 hours: Santorini is 6 hours behind Beijing. When Greece observes daylight saving time (EEST, UTC+3), the difference is 5 hours.
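This can be checked with Python's zoneinfo module (Python 3.9+), using the IANA zones Asia/Shanghai for Beijing and Europe/Athens for Santorini.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

when = datetime(2024, 1, 15, 12, 0)   # arbitrary winter date

beijing = when.replace(tzinfo=ZoneInfo("Asia/Shanghai"))
santorini = when.replace(tzinfo=ZoneInfo("Europe/Athens"))

diff = beijing.utcoffset() - santorini.utcoffset()
print(diff)   # 6:00:00 in winter; 5:00:00 when Greece is on summer time
```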
Flight carbon footprint between Beijing Nanyuan Airport (NAY) and Santorini (Thira) International Airport (JTR)
On average, flying from Beijing to Santorini generates about 549 kg of CO2 per passenger; 549 kilograms equals about 1,210 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
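The page does not publish its estimation method; the short sketch below only unpacks its own numbers, converting kilograms to pounds and expressing the estimate as grams of CO2 per passenger-kilometer.

```python
co2_kg = 549.0              # page's per-passenger estimate
distance_km = 7611.471      # Vincenty distance from above

print(f"{co2_kg * 2.20462:.0f} lb")                                  # ≈ 1210 lb
print(f"{co2_kg / distance_km * 1000:.0f} g CO2 per passenger-km")   # ≈ 72 g/pkm
```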
Map of flight path from Beijing to Santorini
See the map of the shortest flight path between Beijing Nanyuan Airport (NAY) and Santorini (Thira) International Airport (JTR).
Airport information
| Origin | Beijing Nanyuan Airport |
| --- | --- |
| City: | Beijing |
| Country: | China |
| IATA Code: | NAY |
| ICAO Code: | ZBNY |
| Coordinates: | 39°46′58″N, 116°23′16″E |
| Destination | Santorini (Thira) International Airport |
| --- | --- |
| City: | Santorini |
| Country: | Greece |
| IATA Code: | JTR |
| ICAO Code: | LGSR |
| Coordinates: | 36°23′57″N, 25°28′45″E |
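For convenience, here is a small helper (not part of the page) that converts the DMS coordinates in these tables into the decimal degrees used in the earlier sketches.

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like '39°46′58″N' to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("39°46′58″N"), dms_to_decimal("116°23′16″E"))  # NAY
print(dms_to_decimal("36°23′57″N"), dms_to_decimal("25°28′45″E"))   # JTR
```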