
How far is Santa Cruz De La Palma from Hamilton?

The distance between Hamilton (L.F. Wade International Airport) and Santa Cruz De La Palma (La Palma Airport) is 2789 miles / 4488 kilometers / 2423 nautical miles.

L.F. Wade International Airport – La Palma Airport: 2789 miles / 4488 kilometers / 2423 nautical miles


Distance from Hamilton to Santa Cruz De La Palma

There are several ways to calculate the distance from Hamilton to Santa Cruz De La Palma. Here are two standard methods:

Vincenty's formula (applied above)
  • 2788.872 miles
  • 4488.255 kilometers
  • 2423.464 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
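For illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The ellipsoid choice, the decimal coordinates, and the function name are our assumptions, not details published by the calculator; with the airport coordinates listed below it lands close to the figure above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles.
    Assumes the points are neither coincident nor nearly antipodal."""
    a = 6378137.0             # semi-major axis (meters)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):      # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    # Vincenty's series for the ellipsoidal arc length
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344  # meters -> statute miles

# BDA (32°21'50"N, 64°40'43"W) to SPC (28°37'35"N, 17°45'20"W)
print(vincenty_miles(32.363889, -64.678611, 28.626389, -17.755556))
# ~2788.9 miles
```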

Haversine formula
  • 2783.478 miles
  • 4479.573 kilometers
  • 2418.776 nautical miles

The haversine formula calculates the great-circle distance between latitude/longitude points assuming a spherical earth – the shortest path between two points along the sphere's surface.
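A matching haversine sketch, assuming a mean earth radius of 6371.0088 km (a common convention; a different radius shifts the result slightly):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of mean radius 6371.0088 km; returns miles."""
    R = 6371.0088  # mean earth radius in km (assumed value)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h)) / 1.609344  # km -> miles

print(haversine_miles(32.363889, -64.678611, 28.626389, -17.755556))
# ~2783.5 miles
```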

How long does it take to fly from Hamilton to Santa Cruz De La Palma?

The estimated flight time from L.F. Wade International Airport to La Palma Airport is 5 hours and 46 minutes.
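The calculator does not publish its timing formula. A common back-of-the-envelope estimate, shown below purely as an assumption, divides the distance by a typical jet cruise speed of about 500 mph and adds a fixed allowance for taxi, climb, and descent; it will not exactly reproduce the figure above, since the actual speed and overhead assumptions differ.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent
    allowance. The 500 mph speed and 30-minute overhead are assumptions,
    not the calculator's method."""
    hours = distance_miles / cruise_mph + overhead_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours and {m} minutes"

print(estimate_flight_time(2789))  # "6 hours and 5 minutes" with these assumptions
```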

Flight carbon footprint between L.F. Wade International Airport (BDA) and La Palma Airport (SPC)

On average, flying from Hamilton to Santa Cruz De La Palma generates about 309 kg of CO2 per passenger; 309 kilograms equals about 681 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
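The pound figure is a straightforward unit conversion (1 kg ≈ 2.20462 lb):

```python
kg = 309
lbs = kg * 2.20462  # pounds per kilogram
print(round(lbs))   # -> 681
```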

Map of flight path from Hamilton to Santa Cruz De La Palma

See the map of the shortest flight path between L.F. Wade International Airport (BDA) and La Palma Airport (SPC).

Airport information

Origin: L.F. Wade International Airport
City: Hamilton
Country: Bermuda
IATA Code: BDA
ICAO Code: TXKF
Coordinates: 32°21′50″N, 64°40′43″W
Destination: La Palma Airport
City: Santa Cruz De La Palma
Country: Spain
IATA Code: SPC
ICAO Code: GCLA
Coordinates: 28°37′35″N, 17°45′20″W