
How far is Santa Cruz De La Palma from Nairobi?

The distance between Nairobi (Jomo Kenyatta International Airport) and Santa Cruz De La Palma (La Palma Airport) is 4162 miles / 6698 kilometers / 3617 nautical miles.

Jomo Kenyatta International Airport – La Palma Airport

  • 4162 miles
  • 6698 kilometers
  • 3617 nautical miles


Distance from Nairobi to Santa Cruz De La Palma

There are several ways to calculate the distance from Nairobi to Santa Cruz De La Palma. Here are two standard methods:

Vincenty's formula (applied above)
  • 4161.978 miles
  • 6698.055 kilometers
  • 3616.660 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
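For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The constants, convergence tolerance, and decimal-degree coordinates (converted from the DMS values under Airport information below) are standard textbook choices, not taken from this site:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: geodesic distance in metres on WGS-84."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

m = vincenty_distance(-1.3192, 36.9278, 28.6264, -17.7556)  # NBO -> SPC
print(m / 1000, m / 1609.344)   # should print roughly 6698 km and 4162 miles
```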

Haversine formula
  • 4162.662 miles
  • 6699.156 kilometers
  • 3617.255 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
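The haversine formula is short enough to show in full. A minimal sketch, assuming a mean Earth radius of 6371 km (the site's exact radius constant is not stated, so the last digits may differ):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_distance(-1.3192, 36.9278, 28.6264, -17.7556))  # ~6699 km
```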

How long does it take to fly from Nairobi to Santa Cruz De La Palma?

The estimated flight time from Jomo Kenyatta International Airport to La Palma Airport is 8 hours and 22 minutes.
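The page does not say how this figure is derived; a common approach is simply distance divided by an assumed average block speed. As a purely illustrative sketch, a hypothetical block speed of about 497.5 mph happens to reproduce the 8 hours 22 minutes above:

```python
def estimate_flight_time(distance_miles, block_speed_mph=497.5):
    """Crude flight-time estimate: distance over an assumed average speed."""
    hours = distance_miles / block_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

print(estimate_flight_time(4162))  # (8, 22) with this assumed speed
```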

Flight carbon footprint between Jomo Kenyatta International Airport (NBO) and La Palma Airport (SPC)

On average, flying from Nairobi to Santa Cruz De La Palma generates about 477 kg of CO2 per passenger; 477 kilograms equals 1,051 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
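The unit conversion and the implied per-mile intensity are easy to check (the 0.45359237 kg/lb factor is exact; the per-mile figure is just the quoted total divided by the Vincenty distance):

```python
KG_PER_LB = 0.45359237          # exact definition of the pound

co2_kg = 477
distance_miles = 4162

print(co2_kg / KG_PER_LB)                 # ~1051.6 lbs, matching the text
print(co2_kg / distance_miles * 1000)     # ~114.6 g CO2 per passenger-mile
```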

Map of flight path from Nairobi to Santa Cruz De La Palma

See the map of the shortest flight path between Jomo Kenyatta International Airport (NBO) and La Palma Airport (SPC).

Airport information

Origin: Jomo Kenyatta International Airport
City: Nairobi
Country: Kenya
IATA Code: NBO
ICAO Code: HKJK
Coordinates: 1°19′9″S, 36°55′40″E

Destination: La Palma Airport
City: Santa Cruz De La Palma
Country: Spain
IATA Code: SPC
ICAO Code: GCLA
Coordinates: 28°37′35″N, 17°45′20″W
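The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small hypothetical helper for the conversion:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a string like '1°19′9″S' to signed decimal degrees."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value   # south and west are negative

print(dms_to_decimal("1°19′9″S"), dms_to_decimal("36°55′40″E"))    # ~-1.3192, 36.9278
print(dms_to_decimal("28°37′35″N"), dms_to_decimal("17°45′20″W"))  # ~28.6264, -17.7556
```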