Air Miles Calculator

How far is London from Santa Cruz De La Palma?

The distance between Santa Cruz De La Palma (La Palma Airport) and London (London City Airport) is 1828 miles / 2942 kilometers / 1589 nautical miles.

The driving distance from Santa Cruz De La Palma (SPC) to London (LCY) is 2385 miles / 3839 kilometers, and travel time by car is about 65 hours 7 minutes.

La Palma Airport – London City Airport

  • 1828 miles
  • 2942 kilometers
  • 1589 nautical miles


Distance from Santa Cruz De La Palma to London

There are several ways to calculate the distance from Santa Cruz De La Palma to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 1828.349 miles
  • 2942.443 kilometers
  • 1588.792 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
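For reference, Vincenty's inverse method can be sketched in Python on the WGS-84 ellipsoid. The coordinates are the decimal form of the airport positions listed in the airport information below; the convergence tolerance and iteration cap are arbitrary choices, not part of the formula itself.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance (km) on the WGS-84 ellipsoid via Vincenty's inverse formula."""
    a = 6378137.0                      # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563              # WGS-84 flattening
    b = (1 - f) * a                    # semi-minor axis (m)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):          # iterate the auxiliary longitude lambda
        sinlam, coslam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sinlam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * coslam) ** 2)
        if sin_sigma == 0:
            return 0.0                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial-line special case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# SPC (28°37′35″N, 17°45′20″W) to LCY (51°30′19″N, 0°3′19″E)
d = vincenty_km(28.626389, -17.755556, 51.505278, 0.055278)
print(f"{d:.1f} km")
```

Running this with the airport coordinates reproduces the roughly 2942 km figure quoted above.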

Haversine formula
  • 1829.122 miles
  • 2943.686 kilometers
  • 1589.463 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
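As a cross-check, the haversine formula is only a few lines of Python. The 6371 km mean Earth radius is a conventional choice, and the decimal coordinates are converted from the DMS values in the airport information below:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.asin(math.sqrt(a))

# SPC (28°37′35″N, 17°45′20″W) to LCY (51°30′19″N, 0°3′19″E)
d = haversine_km(28.626389, -17.755556, 51.505278, 0.055278)
print(f"{d:.1f} km, {d / 1.609344:.1f} mi, {d / 1.852:.1f} nm")
```

The result lands within a fraction of a kilometer of the 2943.686 km quoted above; the small gap versus Vincenty reflects the spherical versus ellipsoidal Earth model.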

How long does it take to fly from Santa Cruz De La Palma to London?

The estimated flight time from La Palma Airport to London City Airport is 3 hours and 57 minutes.
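A back-of-the-envelope version of such an estimate (a guess at the general approach, not this site's actual formula) divides the great-circle distance by a typical jet cruise speed and adds a fixed allowance for taxi, climb, and descent; with the assumed parameters below it lands in the same ballpark as, but not exactly on, the 3 hours 57 minutes quoted:

```python
def flight_time_minutes(distance_mi, cruise_mph=500.0, overhead_min=30.0):
    """Rough block time in minutes: cruise at an assumed average speed
    plus a fixed overhead. Both parameters are assumptions; real schedules
    vary with winds, routing, and aircraft type."""
    return distance_mi / cruise_mph * 60 + overhead_min

t = flight_time_minutes(1828)
print(f"{int(t // 60)} h {round(t % 60)} min")
```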

What is the time difference between Santa Cruz De La Palma and London?

There is no time difference between Santa Cruz De La Palma and London.

Flight carbon footprint between La Palma Airport (SPC) and London City Airport (LCY)

On average, flying from Santa Cruz De La Palma to London generates about 202 kg of CO2 per passenger, which is equivalent to roughly 446 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
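Using only the figures quoted on this page, one can derive the implied per-kilometre emissions intensity and double-check the unit conversion (2.20462 lb per kg is the standard factor; a flat 202 kg converts to 445 lb, so the page's 446 lb suggests the underlying estimate is slightly above 202 kg):

```python
co2_kg = 202.0        # per-passenger estimate quoted above
dist_km = 2943.686    # haversine distance quoted above

intensity = co2_kg / dist_km   # implied kg CO2 per passenger-km
co2_lb = co2_kg * 2.20462      # kg -> lb conversion
print(f"{intensity:.4f} kg/pax-km, {co2_lb:.0f} lb")
```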

Map of flight path and driving directions from Santa Cruz De La Palma to London

See the map of the shortest flight path between La Palma Airport (SPC) and London City Airport (LCY).

Airport information

Origin: La Palma Airport
City: Santa Cruz De La Palma
Country: Spain
IATA Code: SPC
ICAO Code: GCLA
Coordinates: 28°37′35″N, 17°45′20″W
Destination: London City Airport
City: London
Country: United Kingdom
IATA Code: LCY
ICAO Code: EGLC
Coordinates: 51°30′19″N, 0°3′19″E
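The coordinates above are given in degrees-minutes-seconds; a small helper (the function name here is my own) converts them to the signed decimal degrees that distance formulas expect:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert D°M′S″ plus a hemisphere letter (N/S/E/W) to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60 + seconds / 3600)

# La Palma Airport (SPC): 28°37′35″N, 17°45′20″W
print(dms_to_decimal(28, 37, 35, "N"), dms_to_decimal(17, 45, 20, "W"))
# London City Airport (LCY): 51°30′19″N, 0°3′19″E
print(dms_to_decimal(51, 30, 19, "N"), dms_to_decimal(0, 3, 19, "E"))
```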