How far is Jerez de la Frontera from Santa Cruz De La Palma?
The distance between Santa Cruz De La Palma (La Palma Airport) and Jerez de la Frontera (Jerez Airport) is 880 miles / 1417 kilometers / 765 nautical miles.
The driving distance from Santa Cruz De La Palma (SPC) to Jerez de la Frontera (XRY) is 1107 miles / 1781 kilometers, and travel time by car is about 43 hours 12 minutes.
La Palma Airport – Jerez Airport
Distance from Santa Cruz De La Palma to Jerez de la Frontera
There are several ways to calculate the distance from Santa Cruz De La Palma to Jerez de la Frontera. Here are two standard methods:
Vincenty's formula (applied above)
- 880.477 miles
- 1416.991 kilometers
- 765.114 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
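Vincenty's classic iteration is lengthy to implement by hand; as an illustrative sketch, the pyproj library solves the same inverse problem on the WGS84 ellipsoid (via Karney's algorithm, a modern refinement of Vincenty's approach, so small differences from the figure above are possible). The coordinates are the airport positions from the tables at the end of this page, converted to decimal degrees.

```python
from pyproj import Geod  # pip install pyproj

geod = Geod(ellps="WGS84")  # ellipsoidal Earth model

# Airport coordinates in decimal degrees (west longitudes negative)
spc_lat, spc_lon = 28.6264, -17.7556  # La Palma Airport (SPC)
xry_lat, xry_lon = 36.7444, -6.0600   # Jerez Airport (XRY)

# inv() takes lon/lat order and returns forward azimuth, back azimuth,
# and distance in meters
_, _, dist_m = geod.inv(spc_lon, spc_lat, xry_lon, xry_lat)
print(f"{dist_m / 1000:.1f} km / {dist_m / 1609.344:.1f} mi")
# ≈ 1417 km / 880.5 mi, in line with the Vincenty figure above
```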
Haversine formula
- 880.328 miles
- 1416.750 kilometers
- 764.984 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the earth's surface).
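The haversine calculation is compact enough to write out in full. This minimal Python sketch assumes the conventional 6371 km mean Earth radius (an assumption, since the page does not state its radius) and uses the airport coordinates from the tables below; it reproduces the haversine figures above to within rounding.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers, assuming a spherical Earth."""
    r = 6371.0  # assumed mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Airport coordinates in decimal degrees (west longitudes negative)
spc = (28.6264, -17.7556)  # La Palma Airport (SPC)
xry = (36.7444, -6.0600)   # Jerez Airport (XRY)

km = haversine_km(*spc, *xry)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} nmi")
# ≈ 1416.8 km / 880.3 mi / 765.0 nmi
```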
How long does it take to fly from Santa Cruz De La Palma to Jerez de la Frontera?
The estimated flight time from La Palma Airport to Jerez Airport is 2 hours and 10 minutes.
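The page does not state how this estimate is derived. A common rule of thumb (an assumption here, not the page's method) adds a fixed allowance for takeoff and landing to cruise time at a typical airliner speed; the sketch below lands close to the 2 hours 10 minutes quoted above, with the exact result depending on the assumed speed and overhead.

```python
CRUISE_MPH = 500     # assumed average cruise speed
OVERHEAD_MIN = 30    # assumed allowance for takeoff, climb, and landing

distance_mi = 880.5  # great-circle distance from above
total_min = distance_mi / CRUISE_MPH * 60 + OVERHEAD_MIN
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # ≈ 2 h 15 min
```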
What is the time difference between Santa Cruz De La Palma and Jerez de la Frontera?
The time difference between Santa Cruz De La Palma and Jerez de la Frontera is 1 hour: the Canary Islands are one hour behind mainland Spain, so Jerez de la Frontera is 1 hour ahead of Santa Cruz De La Palma.
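You can confirm this with Python's standard-library zoneinfo, comparing the two IANA time zones (Atlantic/Canary for La Palma, Europe/Madrid for Jerez); a minimal sketch:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

now_utc = datetime.now(ZoneInfo("UTC"))
spc_local = now_utc.astimezone(ZoneInfo("Atlantic/Canary"))  # Santa Cruz De La Palma
xry_local = now_utc.astimezone(ZoneInfo("Europe/Madrid"))    # Jerez de la Frontera

diff = xry_local.utcoffset() - spc_local.utcoffset()
print(f"Jerez is {diff.total_seconds() / 3600:+.0f} h relative to La Palma")
# +1 h year-round, since both zones switch to summer time together
```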
Flight carbon footprint between La Palma Airport (SPC) and Jerez Airport (XRY)
On average, flying from Santa Cruz De La Palma to Jerez de la Frontera generates about 142 kg of CO2 per passenger, which is equivalent to 313 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
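The unit conversion is simple arithmetic; a short sketch (the 142 kg figure is the page's estimate, and the kg-to-lb factor is the standard conversion constant):

```python
co2_kg = 142               # estimated CO2 per passenger, from above
co2_lb = co2_kg * 2.20462  # kilograms to pounds
print(f"{co2_lb:.0f} lbs") # ≈ 313 lbs
```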
Map of flight path and driving directions from Santa Cruz De La Palma to Jerez de la Frontera
See the map of the shortest flight path between La Palma Airport (SPC) and Jerez Airport (XRY).
Airport information
| Origin | La Palma Airport |
| --- | --- |
| City | Santa Cruz De La Palma |
| Country | Spain |
| IATA Code | SPC |
| ICAO Code | GCLA |
| Coordinates | 28°37′35″N, 17°45′20″W |
| Destination | Jerez Airport |
| --- | --- |
| City | Jerez de la Frontera |
| Country | Spain |
| IATA Code | XRY |
| ICAO Code | LEJR |
| Coordinates | 36°44′40″N, 6°3′36″W |