How far is Patras from Sharm el-Sheikh?
The distance between Sharm el-Sheikh (Sharm El Sheikh International Airport) and Patras (Patras Araxos Airport) is 1027 miles / 1652 kilometers / 892 nautical miles.
Sharm El Sheikh International Airport – Patras Araxos Airport
Distance from Sharm el-Sheikh to Patras
There are several ways to calculate the distance from Sharm el-Sheikh to Patras. Here are two standard methods:
Vincenty's formula (applied above)
- 1026.561 miles
- 1652.090 kilometers
- 892.057 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
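For a reproducible check, here is a short Python sketch using the geopy library's `geodesic` distance, which, like Vincenty's formula, models the earth as the WGS-84 ellipsoid (geopy uses Karney's algorithm rather than Vincenty's iteration, so the last decimal places may differ slightly). The coordinates are taken from the airport information tables below.

```python
from geopy.distance import geodesic

# Airport coordinates in decimal degrees, converted from the tables below.
ssh = (27.97722, 34.39500)   # Sharm El Sheikh International Airport (SSH)
gpa = (38.15083, 21.42556)   # Patras Araxos Airport (GPA)

# Ellipsoidal (WGS-84) distance; should closely reproduce the figures above.
d = geodesic(ssh, gpa)
print(f"{d.miles:.3f} miles / {d.km:.3f} km / {d.nautical:.3f} nm")
```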
Haversine formula
- 1026.652 miles
- 1652.237 kilometers
- 892.136 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
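The haversine formula is simple enough to implement directly. The sketch below assumes a mean earth radius of 3,958.8 miles, so the result closely matches the figure above.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance on a spherical earth (mean radius ~3958.8 mi)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_mi * asin(sqrt(a))

# SSH -> GPA, coordinates from the airport tables below.
print(haversine_miles(27.97722, 34.39500, 38.15083, 21.42556))  # ~1026.7
```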
How long does it take to fly from Sharm el-Sheikh to Patras?
The estimated flight time from Sharm El Sheikh International Airport to Patras Araxos Airport is 2 hours and 26 minutes.
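The page does not state how this estimate is derived. A common heuristic for such figures assumes a cruise speed of roughly 500 mph plus a fixed allowance for takeoff and landing; the sketch below uses those assumed values, which land near, though not exactly on, the time quoted above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time heuristic: cruise time plus a fixed taxi/climb/descent
    allowance. cruise_mph and overhead_min are assumed values, not the page's
    actual model."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    h, m = divmod(round(total_min), 60)
    return f"{h} hours and {m} minutes"

print(estimated_flight_time(1027))  # -> "2 hours and 33 minutes" under these assumptions
```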
What is the time difference between Sharm el-Sheikh and Patras?
There is no time difference between Sharm el-Sheikh and Patras.
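Both cities normally observe UTC+2, which is easy to verify with Python's standard-library zoneinfo module. Note that Egypt and Greece switch to daylight saving time on different dates, so the offsets can briefly differ around those transitions.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+; may need the tzdata package on Windows

now = datetime.now(ZoneInfo("UTC"))
for tz in ("Africa/Cairo", "Europe/Athens"):
    # Print each city's current UTC offset; normally both show 2:00:00.
    print(tz, now.astimezone(ZoneInfo(tz)).utcoffset())
```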
Flight carbon footprint between Sharm El Sheikh International Airport (SSH) and Patras Araxos Airport (GPA)
On average, flying from Sharm el-Sheikh to Patras generates about 152 kg of CO2 per passenger, and 152 kilograms equals 335 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
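As a quick sanity check on the unit conversion (1 kg ≈ 2.20462 lb):

```python
LB_PER_KG = 2.20462  # pounds per kilogram (standard conversion factor)
print(round(152 * LB_PER_KG))  # -> 335
```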
Map of flight path from Sharm el-Sheikh to Patras
See the map of the shortest flight path between Sharm El Sheikh International Airport (SSH) and Patras Araxos Airport (GPA).
Airport information
| Origin | Sharm El Sheikh International Airport |
| --- | --- |
| City | Sharm el-Sheikh |
| Country | Egypt |
| IATA Code | SSH |
| ICAO Code | HESH |
| Coordinates | 27°58′38″N, 34°23′42″E |
| Destination | Patras Araxos Airport |
| --- | --- |
| City | Patras |
| Country | Greece |
| IATA Code | GPA |
| ICAO Code | LGRX |
| Coordinates | 38°9′3″N, 21°25′32″E |