How far is Banja Luka from Santa Cruz De La Palma?

The distance between Santa Cruz De La Palma (La Palma Airport) and Banja Luka (Banja Luka International Airport) is 2221 miles / 3575 kilometers / 1930 nautical miles.

The driving distance from Santa Cruz De La Palma (SPC) to Banja Luka (BNX) is 2760 miles / 4441 kilometers, and travel time by car is about 71 hours 15 minutes.

La Palma Airport – Banja Luka International Airport

2221 miles / 3575 kilometers / 1930 nautical miles

Distance from Santa Cruz De La Palma to Banja Luka

There are several ways to calculate the distance from Santa Cruz De La Palma to Banja Luka. Here are two standard methods:

Vincenty's formula (applied above)
  • 2221.188 miles
  • 3574.656 kilometers
  • 1930.160 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
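This ellipsoidal distance can be reproduced with a geodesic library. Below is a minimal sketch in Python using geographiclib, which implements Karney's algorithm rather than Vincenty's iteration; both solve the same inverse geodesic problem on the WGS84 ellipsoid and agree to well under a millimeter on a route like this. The decimal coordinates are converted from the airport information at the end of this page.

```python
from geographiclib.geodesic import Geodesic

# Airport coordinates in decimal degrees (from the airport info below).
spc = (28.6264, -17.7556)  # La Palma Airport (SPC): 28°37′35″N, 17°45′20″W
bnx = (44.9414, 17.2975)   # Banja Luka Intl (BNX): 44°56′29″N, 17°17′51″E

# Inverse geodesic problem on the WGS84 ellipsoid; "s12" is the distance in meters.
meters = Geodesic.WGS84.Inverse(spc[0], spc[1], bnx[0], bnx[1])["s12"]
print(meters / 1609.344)  # ≈ 2221 statute miles
print(meters / 1852)      # ≈ 1930 nautical miles
```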

Haversine formula
  • 2218.509 miles
  • 3570.345 kilometers
  • 1927.832 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
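The haversine formula is simple enough to implement directly. A minimal sketch, using the same airport coordinates and assuming a mean earth radius of 3958.8 miles (the radius chosen affects the result slightly):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles, assuming a spherical earth."""
    r = 3958.8  # mean earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# SPC (28°37′35″N, 17°45′20″W) to BNX (44°56′29″N, 17°17′51″E)
print(haversine_miles(28.6264, -17.7556, 44.9414, 17.2975))  # ≈ 2218.5
```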

How long does it take to fly from Santa Cruz De La Palma to Banja Luka?

The estimated flight time from La Palma Airport to Banja Luka International Airport is 4 hours and 42 minutes.
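The assumptions behind this figure aren't published; a common rule of thumb is a fixed allowance for taxi, climb, and descent plus cruise at an average ground speed. The sketch below uses illustrative values (528 mph cruise, 30 minutes overhead) that happen to reproduce the estimate above:

```python
distance_miles = 2221
cruise_mph = 528       # assumed average cruise ground speed (illustrative)
overhead_min = 30      # assumed taxi/climb/descent allowance (illustrative)

total_min = overhead_min + distance_miles / cruise_mph * 60
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # 4 h 42 min
```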

Flight carbon footprint between La Palma Airport (SPC) and Banja Luka International Airport (BNX)

On average, flying from Santa Cruz De La Palma to Banja Luka generates about 243 kg (535 lb) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
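The unit conversion, and the per-mile rate the estimate implies, are straightforward; a minimal sketch:

```python
co2_kg = 243           # estimated CO2 per passenger (from above)
distance_miles = 2221

print(co2_kg * 2.20462)         # ≈ 535.7 lb
print(co2_kg / distance_miles)  # ≈ 0.109 kg CO2 per passenger-mile
```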

Map of flight path and driving directions from Santa Cruz De La Palma to Banja Luka

See the map of the shortest flight path between La Palma Airport (SPC) and Banja Luka International Airport (BNX).

Airport information

Origin: La Palma Airport
City: Santa Cruz De La Palma
Country: Spain
IATA Code: SPC
ICAO Code: GCLA
Coordinates: 28°37′35″N, 17°45′20″W
Destination: Banja Luka International Airport
City: Banja Luka
Country: Bosnia and Herzegovina
IATA Code: BNX
ICAO Code: LQBK
Coordinates: 44°56′29″N, 17°17′51″E
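The coordinates above are given in degrees, minutes, and seconds, while the distance formulas need decimal degrees. A small sketch of the conversion (dms_to_decimal is a hypothetical helper, not from any particular library):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(28, 37, 35, "N"))  # SPC latitude  ≈  28.6264
print(dms_to_decimal(17, 45, 20, "W"))  # SPC longitude ≈ -17.7556
print(dms_to_decimal(44, 56, 29, "N"))  # BNX latitude  ≈  44.9414
print(dms_to_decimal(17, 17, 51, "E"))  # BNX longitude ≈  17.2975
```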