How far is Santa Cruz De La Palma from Sevilla?
The distance between Sevilla (Seville Airport) and Santa Cruz De La Palma (La Palma Airport) is 916 miles / 1474 kilometers / 796 nautical miles.
The driving distance from Sevilla (SVQ) to Santa Cruz De La Palma (SPC) is 1062 miles / 1709 kilometers, and travel time by car is about 42 hours 44 minutes.
Seville Airport – La Palma Airport
Distance from Sevilla to Santa Cruz De La Palma
There are several ways to calculate the distance from Sevilla to Santa Cruz De La Palma. Here are two standard methods:
Vincenty's formula (applied above)
- 915.810 miles
- 1473.854 kilometers
- 795.818 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
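For reference, here is a self-contained Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the airport positions from the tables below converted to decimal degrees; the tolerance and iteration cap are illustrative choices, not values taken from this page.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    # WGS-84 ellipsoid parameters
    a = 6378137.0              # semi-major axis (meters)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate on the longitude difference on the auxiliary sphere
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial geodesic
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters -> statute miles

# SVQ (37°25′4″N, 5°53′35″W) to SPC (28°37′35″N, 17°45′20″W)
print(vincenty_miles(37.4178, -5.8931, 28.6264, -17.7556))  # ≈ 915.8
```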
Haversine formula
- 915.772 miles
- 1473.792 kilometers
- 795.784 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
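A minimal Python version of the haversine formula, assuming the commonly used mean Earth radius of 6371 km (about 3958.8 statute miles):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    R = 3958.8  # mean Earth radius in statute miles (assumed; ~6371 km)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Same airport coordinates as above, in decimal degrees
print(haversine_miles(37.4178, -5.8931, 28.6264, -17.7556))  # ≈ 915.8
```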
How long does it take to fly from Sevilla to Santa Cruz De La Palma?
The estimated flight time from Seville Airport to La Palma Airport is 2 hours and 14 minutes.
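Sites like this typically derive the estimate from the great-circle distance using a fixed allowance for taxi, climb, and descent plus an average cruise speed. The 30-minute allowance and 500 mph cruise speed below are illustrative assumptions, not the exact constants behind the 2 hours 14 minutes figure:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # overhead_min covers taxi, climb, and descent; both constants are assumptions
    total_min = round(overhead_min + distance_miles / cruise_mph * 60)
    h, m = divmod(total_min, 60)
    return f"{h} hours {m} minutes"

print(estimated_flight_time(916))  # "2 hours 20 minutes" with these assumed constants
```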
What is the time difference between Sevilla and Santa Cruz De La Palma?
The time difference between Sevilla and Santa Cruz De La Palma is 1 hour. Santa Cruz De La Palma, in the Canary Islands, is 1 hour behind Sevilla, which observes mainland Spain time.
Flight carbon footprint between Seville Airport (SVQ) and La Palma Airport (SPC)
On average, flying from Sevilla to Santa Cruz De La Palma generates about 145 kg (319 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
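The kilograms-to-pounds conversion, and the per-mile emission rate implied by the quoted figure, can be checked in a couple of lines (the per-passenger figure itself comes from the text above; only the conversion factor is standard):

```python
co2_kg = 145                    # per-passenger estimate quoted above
print(round(co2_kg * 2.20462))  # 320 lb (the page truncates to 319)
print(round(co2_kg / 916, 3))   # ~0.158 kg of CO2 per mile flown (derived, illustrative)
```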
Map of flight path and driving directions from Sevilla to Santa Cruz De La Palma
See the map of the shortest flight path between Seville Airport (SVQ) and La Palma Airport (SPC).
Airport information
| Origin | Seville Airport |
|---|---|
| City: | Sevilla |
| Country: | Spain |
| IATA Code: | SVQ |
| ICAO Code: | LEZL |
| Coordinates: | 37°25′4″N, 5°53′35″W |
| Destination | La Palma Airport |
|---|---|
| City: | Santa Cruz De La Palma |
| Country: | Spain |
| IATA Code: | SPC |
| ICAO Code: | GCLA |
| Coordinates: | 28°37′35″N, 17°45′20″W |
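The distance formulas above take decimal degrees, while the tables list coordinates in degrees, minutes, and seconds. A small helper for the conversion (the function name is ours, for illustration):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Seville Airport (SVQ): 37°25′4″N, 5°53′35″W
svq = (dms_to_decimal(37, 25, 4, "N"), dms_to_decimal(5, 53, 35, "W"))
# La Palma Airport (SPC): 28°37′35″N, 17°45′20″W
spc = (dms_to_decimal(28, 37, 35, "N"), dms_to_decimal(17, 45, 20, "W"))
print(svq, spc)  # ≈ (37.4178, -5.8931) (28.6264, -17.7556)
```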