How far is Santa Cruz De La Palma from Jerez de la Frontera?
The distance between Jerez de la Frontera (Jerez Airport) and Santa Cruz De La Palma (La Palma Airport) is 880 miles / 1417 kilometers / 765 nautical miles.
The driving distance from Jerez de la Frontera (XRY) to Santa Cruz De La Palma (SPC) is 1110 miles / 1787 kilometers, and travel time by car is about 43 hours 38 minutes.
Jerez Airport – La Palma Airport
Distance from Jerez de la Frontera to Santa Cruz De La Palma
There are several ways to calculate the distance from Jerez de la Frontera to Santa Cruz De La Palma. Here are two standard methods:
Vincenty's formula (applied above)
- 880.477 miles
- 1416.991 kilometers
- 765.114 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
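As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and coordinates (taken from the airport tables below) are ours; with a reasonable convergence tolerance it should land close to the 1416.991 km quoted above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in kilometres via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):         # iterate until the longitude term converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha == 0 only for purely equatorial lines; not the case here
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # metres -> kilometres

# XRY (36°44′40″N, 6°3′36″W) to SPC (28°37′35″N, 17°45′20″W)
print(vincenty_distance(36.74444, -6.06000, 28.62639, -17.75556))  # ~1416.99 km
```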
Haversine formula
- 880.328 miles
- 1416.750 kilometers
- 764.984 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
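For comparison, a minimal haversine implementation follows, using the conventional 6371 km mean Earth radius. With the same coordinates it should come out near the 1416.750 km figure above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    R = 6371.0  # mean Earth radius in km (conventional value)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

print(haversine_distance(36.74444, -6.06000, 28.62639, -17.75556))  # ~1416.7 km
```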
How long does it take to fly from Jerez de la Frontera to Santa Cruz De La Palma?
The estimated flight time from Jerez Airport to La Palma Airport is 2 hours and 10 minutes.
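A common rule of thumb for such estimates (an assumption here, not necessarily the exact method used above) is cruise time at a typical airliner speed of about 500 mph plus roughly 30 minutes for take-off and landing; it gives about 2 hours 16 minutes for this route, in the same ballpark as the quoted figure.

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: cruise at an assumed ~500 mph plus a
    fixed allowance for take-off and landing (assumed values)."""
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = estimated_flight_minutes(880.477)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ~2 h 16 min
```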
What is the time difference between Jerez de la Frontera and Santa Cruz De La Palma?
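The time difference between Jerez de la Frontera and Santa Cruz De La Palma is 1 hour. Santa Cruz De La Palma is 1 hour behind Jerez de la Frontera: the Canary Islands observe Western European Time, while mainland Spain uses Central European Time, and both switch to summer time together, so the offset holds year-round.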
Flight carbon footprint between Jerez Airport (XRY) and La Palma Airport (SPC)
On average, flying from Jerez de la Frontera to Santa Cruz De La Palma generates about 142 kg of CO2 per passenger, which is equivalent to 313 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
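The unit conversion and the implied per-mile rate can be checked directly; the ~0.16 kg/mile factor below is simply derived from the two figures quoted above, not an independently sourced emission factor.

```python
KG_PER_LB = 0.45359237           # exact definition of the pound

co2_kg = 142
print(co2_kg / KG_PER_LB)        # ~313 lb

# Implied per-passenger emission rate from the figures above
print(co2_kg / 880.477)          # ~0.161 kg of CO2 per mile flown
```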
Map of flight path and driving directions from Jerez de la Frontera to Santa Cruz De La Palma
See the map of the shortest flight path between Jerez Airport (XRY) and La Palma Airport (SPC).
Airport information
| Origin | Jerez Airport |
| --- | --- |
| City: | Jerez de la Frontera |
| Country: | Spain |
| IATA Code: | XRY |
| ICAO Code: | LEJR |
| Coordinates: | 36°44′40″N, 6°3′36″W |
| Destination | La Palma Airport |
| --- | --- |
| City: | Santa Cruz De La Palma |
| Country: | Spain |
| IATA Code: | SPC |
| ICAO Code: | GCLA |
| Coordinates: | 28°37′35″N, 17°45′20″W |