
How far is Patras from Santa Cruz De La Palma?

The distance between Santa Cruz De La Palma (La Palma Airport) and Patras (Patras Araxos Airport) is 2339 miles / 3765 kilometers / 2033 nautical miles.

The driving distance from Santa Cruz De La Palma (SPC) to Patras (GPA) is 3212 miles / 5170 kilometers, and travel time by car is about 85 hours 15 minutes.

La Palma Airport – Patras Araxos Airport

2339 miles / 3765 kilometers / 2033 nautical miles


Distance from Santa Cruz De La Palma to Patras

There are several ways to calculate the distance from Santa Cruz De La Palma to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 2339.280 miles
  • 3764.707 kilometers
  • 2032.779 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
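For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The constants, function name, and decimal coordinates are illustrative assumptions; the calculator's exact implementation is not published.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters (an assumption; the site does not state them)
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero on equatorial lines where cos2_alpha == 0
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000   # metres -> kilometres

# SPC (28°37'35"N, 17°45'20"W) and GPA (38°9'3"N, 21°25'32"E) in decimal degrees
print(round(vincenty_km(28.6264, -17.7556, 38.1508, 21.4256), 1))  # ~3764.7 km
```

With these inputs the result lands at roughly 3764.7 km, matching the figure reported above.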

Haversine formula
  • 2335.166 miles
  • 3758.085 kilometers
  • 2029.204 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
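A corresponding haversine sketch, assuming the commonly used mean Earth radius of 6371.0088 km (the radius behind the figures above is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0088):
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Great-circle distance on a sphere of the given radius
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

print(round(haversine_km(28.6264, -17.7556, 38.1508, 21.4256), 1))  # ~3758 km
```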

How long does it take to fly from Santa Cruz De La Palma to Patras?

The estimated flight time from La Palma Airport to Patras Araxos Airport is 4 hours and 55 minutes.
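The page does not state how this estimate is derived. One common rule of thumb is to divide the great-circle distance by an assumed average speed; the sketch below uses about 475 mph, a value chosen so the result lands near the figure above, and is an assumption rather than the site's method.

```python
# Rough sketch, NOT the site's published method: assume an average speed
# of ~475 mph over the whole great-circle distance.
def flight_time(distance_miles, avg_mph=475):
    hours = distance_miles / avg_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m:02d} m"

print(flight_time(2339))  # "4 h 55 m" with this assumed speed
```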

Flight carbon footprint between La Palma Airport (SPC) and Patras Araxos Airport (GPA)

On average, flying from Santa Cruz De La Palma to Patras generates about 256 kg of CO2 per passenger (roughly 565 pounds). This figure is an estimate and includes only the CO2 generated by burning jet fuel.
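A back-of-envelope way to approximate such figures is a flat emission factor per passenger-mile. The factor in the sketch below is reverse-engineered from this page's own numbers (256 kg over 2339 miles is about 0.109 kg per passenger-mile) and is an assumption, not the site's actual model:

```python
# Assumed flat emission factor derived from the figures on this page;
# real carbon calculators use aircraft- and route-specific models.
KG_PER_PASSENGER_MILE = 0.109

def co2_per_passenger(distance_miles, factor=KG_PER_PASSENGER_MILE):
    kg = distance_miles * factor
    return kg, kg * 2.20462   # (kilograms, pounds)

kg, lb = co2_per_passenger(2339)
print(f"{kg:.0f} kg = {lb:.0f} lb")  # ~255 kg = ~562 lb
```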

Map of flight path and driving directions from Santa Cruz De La Palma to Patras

See the map of the shortest flight path between La Palma Airport (SPC) and Patras Araxos Airport (GPA).

Airport information

Origin: La Palma Airport
City: Santa Cruz De La Palma
Country: Spain
IATA Code: SPC
ICAO Code: GCLA
Coordinates: 28°37′35″N, 17°45′20″W

Destination: Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E
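To use these coordinates with the distance sketches above, the degrees/minutes/seconds values need converting to decimal degrees. A small illustrative helper (the function name is hypothetical):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    # South and West hemispheres get a negative sign
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

spc = (dms_to_decimal(28, 37, 35, "N"), dms_to_decimal(17, 45, 20, "W"))
gpa = (dms_to_decimal(38, 9, 3, "N"), dms_to_decimal(21, 25, 32, "E"))
print(spc, gpa)  # approx. (28.6264, -17.7556) and (38.1508, 21.4256)
```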