How far is Patras from Phoenix, AZ?
The distance between Phoenix (Phoenix Sky Harbor International Airport) and Patras (Patras Araxos Airport) is 6672 miles / 10738 kilometers / 5798 nautical miles.
Phoenix Sky Harbor International Airport – Patras Araxos Airport
Distance from Phoenix to Patras
There are several ways to calculate the distance from Phoenix to Patras. Here are two standard methods:
Vincenty's formula (applied above)
- 6672.172 miles
- 10737.820 kilometers
- 5797.959 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
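If you want to reproduce this figure yourself, here is a minimal Python sketch using the geopy library (an assumed dependency, not something this page states it uses); its `geodesic` function performs an ellipsoidal (WGS-84) calculation comparable to Vincenty's formula. The decimal coordinates are converted from the airport tables below.

```python
from geopy.distance import geodesic

phx = (33.434167, -112.011944)  # PHX, decimal degrees (from the table below)
gpa = (38.150833, 21.425556)    # GPA, decimal degrees (from the table below)

# geodesic() computes the distance on the WGS-84 ellipsoid, which for
# practical purposes matches Vincenty's ellipsoidal result.
d = geodesic(phx, gpa)
print(f"{d.miles:.3f} mi, {d.km:.3f} km, {d.nautical:.3f} nm")
# ≈ 6672.172 mi, 10737.820 km, 5797.959 nm
```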
Haversine formula
- 6658.157 miles
- 10715.265 kilometers
- 5785.780 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (the great-circle distance, i.e. the shortest path between the two points along the surface).
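The haversine formula is short enough to implement directly. A minimal Python sketch, assuming the conventional mean Earth radius of 6371 km (about 3958.8 miles):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    R = 3958.8  # mean Earth radius in miles (assumed conventional value)
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * R * asin(sqrt(a))

phx = (33.434167, -112.011944)
gpa = (38.150833, 21.425556)
print(f"{haversine_miles(*phx, *gpa):.1f} miles")  # ≈ 6658 miles
```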
How long does it take to fly from Phoenix to Patras?
The estimated flight time from Phoenix Sky Harbor International Airport to Patras Araxos Airport is 13 hours and 7 minutes.
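The page does not state how this estimate is computed. A common approach is to divide the great-circle distance by an assumed average speed; back-solving from the figures above gives roughly 509 mph, which the sketch below uses as an illustrative assumption only:

```python
def format_flight_time(distance_miles, avg_speed_mph=509.0):
    # avg_speed_mph is an assumption back-derived from the 13 h 7 min
    # figure above, not a value stated by the page.
    total_minutes = round(distance_miles / avg_speed_mph * 60)
    hours, minutes = divmod(total_minutes, 60)
    return f"{hours} hours and {minutes} minutes"

print(format_flight_time(6672.172))  # 13 hours and 7 minutes
```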
What is the time difference between Phoenix and Patras?
The time difference between Phoenix and Patras is 9 hours: Patras is 9 hours ahead of Phoenix. Note that Arizona does not observe daylight saving time while Greece does, so the gap widens to 10 hours during European summer time.
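This can be verified with Python's standard-library zoneinfo module (Patras falls in the Europe/Athens time zone); a short sketch:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def offset_hours(moment):
    """Hours that Patras (Europe/Athens) is ahead of Phoenix (America/Phoenix)."""
    phoenix = moment.astimezone(ZoneInfo("America/Phoenix")).utcoffset()
    patras = moment.astimezone(ZoneInfo("Europe/Athens")).utcoffset()
    return (patras - phoenix).total_seconds() / 3600

winter = datetime(2024, 1, 15, tzinfo=ZoneInfo("UTC"))
summer = datetime(2024, 7, 15, tzinfo=ZoneInfo("UTC"))
print(offset_hours(winter))  # 9.0  (MST vs. EET)
print(offset_hours(summer))  # 10.0 (Arizona skips DST; Greece observes EEST)
```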
Flight carbon footprint between Phoenix Sky Harbor International Airport (PHX) and Patras Araxos Airport (GPA)
On average, flying from Phoenix to Patras generates about 809 kg of CO2 per passenger, which equals roughly 1,783 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
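The pound figure is a direct unit conversion from kilograms, as this quick sketch shows:

```python
CO2_KG = 809            # the page's per-passenger estimate for this route
LB_PER_KG = 2.20462     # standard kilogram-to-pound conversion factor

print(int(CO2_KG * LB_PER_KG))  # 1783 lb (truncated; 809 kg = 1783.54 lb)
```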
Map of flight path from Phoenix to Patras
See the map of the shortest flight path between Phoenix Sky Harbor International Airport (PHX) and Patras Araxos Airport (GPA).
Airport information
| Origin | Phoenix Sky Harbor International Airport |
|---|---|
| City: | Phoenix, AZ |
| Country: | United States |
| IATA Code: | PHX |
| ICAO Code: | KPHX |
| Coordinates: | 33°26′3″N, 112°0′43″W |
| Destination | Patras Araxos Airport |
|---|---|
| City: | Patras |
| Country: | Greece |
| IATA Code: | GPA |
| ICAO Code: | LGRX |
| Coordinates: | 38°9′3″N, 21°25′32″E |
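The decimal coordinates used in the code sketches above come from these DMS values. A small conversion sketch (the `dms_to_decimal` helper is hypothetical, written for this page):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like '33°26′3″N' to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("33°26′3″N"), dms_to_decimal("112°0′43″W"))
# 33.434166...  -112.011944...
```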