How far is Patras from Palermo?

The distance between Palermo (Falcone Borsellino Airport) and Patras (Patras Araxos Airport) is 454 miles / 730 kilometers / 394 nautical miles.

The driving distance from Palermo (PMO) to Patras (GPA) is 791 miles / 1273 kilometers, and travel time by car is about 22 hours 26 minutes.

Distance from Palermo to Patras

There are several ways to calculate the distance from Palermo to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 453.712 miles
  • 730.179 kilometers
  • 394.265 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
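
As a rough illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are decimal-degree conversions of the airport table below, and the loop skips edge cases (coincident and near-antipodal points), so treat it as a sketch rather than the calculator's actual implementation.

    import math

    A_AXIS = 6378137.0          # WGS-84 semi-major axis (meters)
    F = 1 / 298.257223563       # WGS-84 flattening
    B_AXIS = (1 - F) * A_AXIS   # semi-minor axis (meters)

    def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12):
        L = math.radians(lon2 - lon1)
        ru1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
        ru2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
        sin_u1, cos_u1 = math.sin(ru1), math.cos(ru1)
        sin_u2, cos_u2 = math.sin(ru2), math.cos(ru2)
        lam = L
        for _ in range(200):    # iterate until lambda converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cos_u2 * sin_lam,
                                   cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
            cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sin_u1 * sin_u2 / cos2_alpha
            C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * F * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break
        u_sq = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
        A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
        B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
        meters = B_AXIS * A * (sigma - d_sigma)
        return meters / 1609.344    # meters to statute miles

    # PMO -> GPA, decimal degrees from the airport table below
    print(vincenty_miles(38.17583, 13.09083, 38.15083, 21.42556))
    # should print close to the 453.712 miles quoted above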

Haversine formula
  • 452.625 miles
  • 728.430 kilometers
  • 393.321 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface of the sphere.
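
A self-contained haversine sketch in the same vein; the 6371 km mean Earth radius is the usual spherical assumption:

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # radius_km is the mean Earth radius (spherical model)
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        d_phi = math.radians(lat2 - lat1)
        d_lam = math.radians(lon2 - lon1)
        a = (math.sin(d_phi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    # PMO -> GPA, decimal degrees from the airport table below
    print(haversine_km(38.17583, 13.09083, 38.15083, 21.42556))
    # prints roughly 728.4, matching the 728.430 kilometers above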

How long does it take to fly from Palermo to Patras?

The estimated flight time from Falcone Borsellino Airport to Patras Araxos Airport is 1 hour and 21 minutes.
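
The calculator's exact assumptions are not published. Estimates like this typically add a fixed allowance for taxi, climb, and descent to the time spent at cruise speed; the constants below are illustrative assumptions, not the site's parameters, so the result only roughly brackets the figure above:

    def flight_minutes(distance_miles, cruise_mph=500.0, overhead_min=30.0):
        # cruise_mph and overhead_min are assumed values for illustration
        return overhead_min + distance_miles / cruise_mph * 60

    print(flight_minutes(454))  # ~84 minutes, near the quoted 1 h 21 min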

Flight carbon footprint between Falcone Borsellino Airport (PMO) and Patras Araxos Airport (GPA)

On average, flying from Palermo to Patras generates about 92 kg of CO2 per passenger, which is roughly 203 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
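
Those numbers imply an emission factor of roughly 0.2 kg of CO2 per passenger-mile on this route (92 kg over about 454 miles). A back-of-the-envelope check, treating that inferred factor as an assumption rather than an official figure:

    def co2_kg(distance_miles, kg_per_mile=0.203):
        # kg_per_mile is inferred from the figures above (92 kg / 454 mi),
        # not an official emission factor
        return distance_miles * kg_per_mile

    kg = co2_kg(454)
    print(kg, kg * 2.20462)  # ~92 kg, ~203 lb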

Airport information

Origin: Falcone Borsellino Airport
City: Palermo
Country: Italy
IATA Code: PMO
ICAO Code: LICJ
Coordinates: 38°10′33″N, 13°5′27″E
Destination: Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E
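
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas take decimal degrees. A small helper for the conversion (the hemisphere letters N/S/E/W determine the sign):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # southern and western hemispheres get a negative sign
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(38, 10, 33, "N"))  # PMO latitude  -> 38.17583...
    print(dms_to_decimal(13, 5, 27, "E"))   # PMO longitude -> 13.09083...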