
How far is London from Santa Cruz De La Palma?

The distance between Santa Cruz De La Palma (La Palma Airport) and London (London Gatwick Airport) is 1802 miles / 2900 kilometers / 1566 nautical miles.

The driving distance from Santa Cruz De La Palma (SPC) to London (LGW) is 2389 miles / 3845 kilometers, and travel time by car is about 65 hours 2 minutes.

Distance from Santa Cruz De La Palma to London

There are several ways to calculate the distance from Santa Cruz De La Palma to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 1802.242 miles
  • 2900.428 kilometers
  • 1566.106 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
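As a quick cross-check in code, the third-party geopy package provides an ellipsoidal (WGS-84) geodesic distance via Karney's algorithm, which agrees with Vincenty's result to well under a metre at these distances. The sketch below is only illustrative; the coordinates are the DMS values from the airport information section converted to decimal degrees.

  # pip install geopy  (third-party package, assumed available)
  from geopy.distance import geodesic

  # Decimal-degree coordinates from the "Airport information" section below
  spc = (28 + 37/60 + 35/3600, -(17 + 45/60 + 20/3600))   # La Palma Airport (SPC)
  lgw = (51 + 8/60 + 53/3600, -(0 + 11/60 + 25/3600))     # London Gatwick Airport (LGW)

  d = geodesic(spc, lgw)  # geodesic distance on the WGS-84 ellipsoid
  print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} NM")
  # expected to land close to the 1802.2 mi / 2900.4 km / 1566.1 NM figures above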

Haversine formula
  • 1803.035 miles
  • 2901.703 kilometers
  • 1566.794 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path along the surface between the two points.
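The haversine calculation is short enough to implement directly. Below is a minimal Python sketch; the function name and the 3,958.8-mile mean earth radius are choices made here for illustration, not values taken from this page.

  from math import radians, sin, cos, asin, sqrt

  def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
      """Great-circle distance between two lat/lon points on a spherical earth."""
      phi1, phi2 = radians(lat1), radians(lat2)
      dphi = radians(lat2 - lat1)
      dlam = radians(lon2 - lon1)
      a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
      return 2 * radius_miles * asin(sqrt(a))

  # SPC (28°37′35″N, 17°45′20″W) to LGW (51°8′53″N, 0°11′25″W)
  print(haversine_miles(28 + 37/60 + 35/3600, -(17 + 45/60 + 20/3600),
                        51 + 8/60 + 53/3600, -(0 + 11/60 + 25/3600)))
  # ≈ 1803 miles, in line with the haversine figure above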

How long does it take to fly from Santa Cruz De La Palma to London?

The estimated flight time from La Palma Airport to London Gatwick Airport is 3 hours and 54 minutes.
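The page does not say how this flight time is derived. A common rule of thumb is cruise distance divided by a typical airliner speed plus a fixed allowance for taxi, climb and descent; the sketch below uses assumed values (500 mph cruise, a 30-minute allowance), so it only roughly approximates the 3 hours 54 minutes quoted above.

  def rough_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
      """Crude flight-time estimate: cruise time plus a fixed climb/descent allowance."""
      hours = distance_miles / cruise_mph + overhead_hours
      h, m = divmod(round(hours * 60), 60)
      return f"{h} h {m:02d} min"

  print(rough_flight_time(1802.242))  # about 4 h 06 min with these assumed values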

What is the time difference between Santa Cruz De La Palma and London?

There is no time difference between Santa Cruz De La Palma and London.

Flight carbon footprint between La Palma Airport (SPC) and London Gatwick Airport (LGW)

On average, flying from Santa Cruz De La Palma to London generates about 200 kg of CO2 per passenger; 200 kilograms is about 441 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
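The kilogram-to-pound conversion quoted above is easy to verify (the 200 kg figure itself is the page's own estimate):

  LB_PER_KG = 2.20462  # pounds per kilogram
  print(round(200 * LB_PER_KG))  # 441 lb, matching the figure above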

Map of flight path and driving directions from Santa Cruz De La Palma to London

See the map of the shortest flight path between La Palma Airport (SPC) and London Gatwick Airport (LGW).

Airport information

Origin: La Palma Airport
City: Santa Cruz De La Palma
Country: Spain
IATA Code: SPC
ICAO Code: GCLA
Coordinates: 28°37′35″N, 17°45′20″W
Destination: London Gatwick Airport
City: London
Country: United Kingdom
IATA Code: LGW
ICAO Code: EGKK
Coordinates: 51°8′53″N, 0°11′25″W