How far is Santa Cruz De La Palma from Badajoz?
The distance between Badajoz (Badajoz Airport) and Santa Cruz De La Palma (La Palma Airport) is 945 miles / 1522 kilometers / 822 nautical miles.
The driving distance from Badajoz (BJZ) to Santa Cruz De La Palma (SPC) is 1150 miles / 1851 kilometers, and travel time by car is about 44 hours 40 minutes; since La Palma is an island in the Canaries, the road route necessarily includes a ferry crossing.
Badajoz Airport – La Palma Airport
Distance from Badajoz to Santa Cruz De La Palma
There are several ways to calculate the distance from Badajoz to Santa Cruz De La Palma. Here are two standard methods:
Vincenty's formula (applied above)
- 945.445 miles
- 1521.546 kilometers
- 821.569 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
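As a rough illustration, here is a minimal Python sketch of the inverse Vincenty iteration on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport tables below; parameter names and tolerances are choices made for this sketch, not the site's actual implementation.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                               * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # metres -> statute miles

# BJZ and SPC coordinates (decimal degrees, from the airport tables below)
print(round(vincenty_miles(38.8911, -6.8211, 28.6264, -17.7556), 3))  # ~945.4
```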
Haversine formula
- 945.870 miles
- 1522.230 kilometers
- 821.938 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
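For comparison, a minimal haversine sketch, assuming a mean Earth radius of 6371 km:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of mean Earth radius, in statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344

print(round(haversine_miles(38.8911, -6.8211, 28.6264, -17.7556), 3))  # ~945.9
```

The spherical result comes out slightly longer than Vincenty's here, matching the two sets of figures above.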
How long does it take to fly from Badajoz to Santa Cruz De La Palma?
The estimated flight time from Badajoz Airport to La Palma Airport is 2 hours and 17 minutes.
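The page does not state how that estimate is derived. One common rule of thumb, shown below purely as an assumption and not necessarily the method used above, adds a fixed allowance for taxi, climb, and descent to the cruise time at an average speed:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # rule of thumb: fixed taxi/climb/descent overhead plus time at cruise speed
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(945.445))  # ~2 h 23 min with these assumed parameters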
What is the time difference between Badajoz and Santa Cruz De La Palma?
Santa Cruz De La Palma is 1 hour behind Badajoz: mainland Spain observes Central European Time (UTC+1), while the Canary Islands observe Western European Time (UTC+0), and both switch to summer time together.
Flight carbon footprint between Badajoz Airport (BJZ) and La Palma Airport (SPC)
On average, flying from Badajoz to Santa Cruz De La Palma generates about 147 kg (324 lb) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
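A back-of-the-envelope version of such an estimate multiplies an assumed per-seat fuel burn by the standard factor of roughly 3.16 kg of CO2 per kg of jet fuel burned. The per-seat burn below is an illustrative assumption, not the site's actual model:

```python
def co2_kg_per_passenger(distance_miles, fuel_kg_per_seat_mile=0.049,
                         co2_per_fuel_kg=3.16):
    # 3.16 kg CO2 per kg of jet fuel is a standard emission factor;
    # the per-seat fuel burn figure is an illustrative assumption
    return distance_miles * fuel_kg_per_seat_mile * co2_per_fuel_kg

kg = co2_kg_per_passenger(945.445)
print(round(kg), "kg CO2 =", round(kg * 2.20462), "lb")  # ~146 kg = ~323 lb
```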
Map of flight path and driving directions from Badajoz to Santa Cruz De La Palma
See the map of the shortest flight path between Badajoz Airport (BJZ) and La Palma Airport (SPC).
Airport information
| Origin | Badajoz Airport |
| --- | --- |
| City: | Badajoz |
| Country: | Spain |
| IATA Code: | BJZ |
| ICAO Code: | LEBZ |
| Coordinates: | 38°53′28″N, 6°49′16″W |
| Destination | La Palma Airport |
| --- | --- |
| City: | Santa Cruz De La Palma |
| Country: | Spain |
| IATA Code: | SPC |
| ICAO Code: | GCLA |
| Coordinates: | 28°37′35″N, 17°45′20″W |
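The coordinates above are given in degrees, minutes, and seconds. A small sketch for converting them to the decimal degrees used by the distance functions earlier:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # south and west hemispheres map to negative decimal degrees
    decimal = degrees + minutes / 60 + seconds / 3600
    return -decimal if hemisphere in ("S", "W") else decimal

# Badajoz Airport: 38°53′28″N, 6°49′16″W
print(round(dms_to_decimal(38, 53, 28, "N"), 4),
      round(dms_to_decimal(6, 49, 16, "W"), 4))   # 38.8911 -6.8211
```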