How far is Patras from Savannah, GA?
The distance between Savannah (Savannah/Hilton Head International Airport) and Patras (Patras Araxos Airport) is 5502 miles / 8855 kilometers / 4781 nautical miles.
Savannah/Hilton Head International Airport – Patras Araxos Airport
Distance from Savannah to Patras
There are several ways to calculate the distance from Savannah to Patras. Here are two standard methods:
Vincenty's formula (applied above)
- 5501.950 miles
- 8854.530 kilometers
- 4781.064 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
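As a cross-check, the ellipsoidal figure above can be approximated with geopy's `geodesic`, which applies Karney's algorithm on the WGS-84 ellipsoid and agrees with Vincenty's formula to well under a meter. This is a minimal sketch, with decimal coordinates converted from the DMS values in the airport information tables below:

```python
# Ellipsoidal distance between SAV and GPA via geopy (Karney's algorithm
# on WGS-84, which closely matches the Vincenty figure quoted above).
from geopy.distance import geodesic

sav = (32.1275, -81.2019)   # SAV: 32°7′39″N, 81°12′7″W
gpa = (38.1508, 21.4256)    # GPA: 38°9′3″N, 21°25′32″E

d = geodesic(sav, gpa)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} nmi")
# Should land very close to 5501.950 mi / 8854.530 km / 4781.064 nmi
```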
Haversine formula
- 5490.141 miles
- 8835.525 kilometers
- 4770.802 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
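The haversine result can be reproduced in a few lines. A minimal sketch, assuming a mean Earth radius of 3958.8 miles (the exact decimals vary slightly with the radius chosen):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance on a sphere, using an assumed mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_mi * math.asin(math.sqrt(a))

print(round(haversine_miles(32.1275, -81.2019, 38.1508, 21.4256), 3))
# ≈ 5490 miles, matching the figure above to within the radius assumption
```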
How long does it take to fly from Savannah to Patras?
The estimated flight time from Savannah/Hilton Head International Airport to Patras Araxos Airport is 10 hours and 55 minutes.
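The page does not state the speed assumption behind this estimate. A minimal sketch, assuming an average block speed of about 504 mph (a hypothetical figure chosen so the result matches the quoted time), shows the shape of the calculation:

```python
# Rough flight-time estimate: distance divided by an assumed average speed.
# The 504 mph figure is an assumption, not the site's published parameter.
distance_mi = 5502
avg_speed_mph = 504
hours = distance_mi / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} hours and {m} minutes")  # -> 10 hours and 55 minutes
```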
What is the time difference between Savannah and Patras?
The time difference between Savannah and Patras is 7 hours. Patras is 7 hours ahead of Savannah.
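A quick check of this offset, assuming Savannah observes US Eastern time (America/New_York) and Patras observes Greek time (Europe/Athens); both zones shift for daylight saving together, so the 7-hour gap holds for most of the year:

```python
# Compare current UTC offsets of the two time zones.
from datetime import datetime
from zoneinfo import ZoneInfo

now = datetime.now(ZoneInfo("UTC"))
sav_offset = now.astimezone(ZoneInfo("America/New_York")).utcoffset()
patras_offset = now.astimezone(ZoneInfo("Europe/Athens")).utcoffset()
print((patras_offset - sav_offset).total_seconds() / 3600)  # -> 7.0
```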
Flight carbon footprint between Savannah/Hilton Head International Airport (SAV) and Patras Araxos Airport (GPA)
On average, flying from Savannah to Patras generates about 650 kg of CO2 per passenger, and 650 kilograms equals 1,433 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
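The kilograms-to-pounds conversion behind the quoted figure is straightforward:

```python
# Unit conversion for the footprint figure: kilograms to pounds.
co2_kg = 650
co2_lb = co2_kg * 2.20462  # 1 kg ≈ 2.20462 lb
print(round(co2_lb))       # -> 1433
```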
Map of flight path from Savannah to Patras
See the map of the shortest flight path between Savannah/Hilton Head International Airport (SAV) and Patras Araxos Airport (GPA).
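The shortest flight path drawn on such a map is the great circle between the two airports. A sketch of how intermediate waypoints along that path can be computed by spherical interpolation (coordinates from the airport tables below; the function name is illustrative):

```python
import math

def gc_waypoints(lat1, lon1, lat2, lon2, n=10):
    """Intermediate points along the great circle, via spherical interpolation."""
    phi1, lmb1, phi2, lmb2 = map(math.radians, (lat1, lon1, lat2, lon2))
    # Angular distance between the endpoints (haversine form)
    d = 2 * math.asin(math.sqrt(
        math.sin((phi2 - phi1) / 2) ** 2
        + math.cos(phi1) * math.cos(phi2) * math.sin((lmb2 - lmb1) / 2) ** 2))
    pts = []
    for i in range(n + 1):
        f = i / n
        a = math.sin((1 - f) * d) / math.sin(d)
        b = math.sin(f * d) / math.sin(d)
        x = a * math.cos(phi1) * math.cos(lmb1) + b * math.cos(phi2) * math.cos(lmb2)
        y = a * math.cos(phi1) * math.sin(lmb1) + b * math.cos(phi2) * math.sin(lmb2)
        z = a * math.sin(phi1) + b * math.sin(phi2)
        pts.append((math.degrees(math.atan2(z, math.hypot(x, y))),
                    math.degrees(math.atan2(y, x))))
    return pts

for lat, lon in gc_waypoints(32.1275, -81.2019, 38.1508, 21.4256, n=4):
    print(f"{lat:7.3f}, {lon:8.3f}")
```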
Airport information
| Origin | Savannah/Hilton Head International Airport |
| --- | --- |
| City | Savannah, GA |
| Country | United States |
| IATA Code | SAV |
| ICAO Code | KSAV |
| Coordinates | 32°7′39″N, 81°12′7″W |
| Destination | Patras Araxos Airport |
| --- | --- |
| City | Patras |
| Country | Greece |
| IATA Code | GPA |
| ICAO Code | LGRX |
| Coordinates | 38°9′3″N, 21°25′32″E |