
How far is Salt Cay from Barcelona?

The distance between Barcelona (General José Antonio Anzoátegui International Airport) and Salt Cay (Salt Cay Airport) is 885 miles / 1424 kilometers / 769 nautical miles.

General José Antonio Anzoátegui International Airport – Salt Cay Airport

  • 885 miles
  • 1424 kilometers
  • 769 nautical miles


Distance from Barcelona to Salt Cay

There are several ways to calculate the distance from Barcelona to Salt Cay. Here are two standard methods:

Vincenty's formula (applied above)
  • 884.924 miles
  • 1424.147 kilometers
  • 768.978 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
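
The calculator's own implementation isn't published, but a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid looks roughly like the following. The decimal coordinates are converted from the DMS values in the airport information below; treat the sketch as illustrative rather than the site's exact code.

  import math

  # WGS-84 ellipsoid parameters
  A_AXIS = 6378137.0               # semi-major axis in metres
  F = 1 / 298.257223563            # flattening
  B_AXIS = (1 - F) * A_AXIS        # semi-minor axis in metres

  def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
      """Inverse Vincenty: ellipsoidal distance between two lat/lon points, in km."""
      U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
      U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
      L = math.radians(lon2 - lon1)
      sinU1, cosU1 = math.sin(U1), math.cos(U1)
      sinU2, cosU2 = math.sin(U2), math.cos(U2)

      lam = L
      for _ in range(max_iter):
          sin_lam, cos_lam = math.sin(lam), math.cos(lam)
          sin_sigma = math.hypot(cosU2 * sin_lam,
                                 cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
          if sin_sigma == 0:
              return 0.0                      # coincident points
          cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
          sigma = math.atan2(sin_sigma, cos_sigma)
          sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
          cos2_alpha = 1 - sin_alpha ** 2
          cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
          C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
          lam_prev = lam
          lam = L + (1 - C) * F * sin_alpha * (
              sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
          if abs(lam - lam_prev) < tol:       # iterate until longitude difference converges
              break

      u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
      A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
      B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
      d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
          cos_sigma * (-1 + 2 * cos_2sm ** 2)
          - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
      return B_AXIS * A * (sigma - d_sigma) / 1000.0   # metres -> kilometres

  # BLA and SLX coordinates in decimal degrees (from the airport information below)
  print(vincenty_km(10.1069, -64.6892, 21.3328, -71.1999))  # ≈ 1424 km, in line with the figure above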

Haversine formula
  • 887.899 miles
  • 1428.936 kilometers
  • 771.563 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
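
For comparison, a short haversine sketch is below. The 6,371 km mean earth radius is an assumption (the page doesn't state which radius it uses), but with it the result lands very close to the figure quoted above.

  import math

  MEAN_EARTH_RADIUS_KM = 6371.0    # common mean radius; an assumption, not the site's stated value

  def haversine_km(lat1, lon1, lat2, lon2):
      """Great-circle distance between two lat/lon points on a spherical earth, in km."""
      phi1, phi2 = math.radians(lat1), math.radians(lat2)
      dphi = math.radians(lat2 - lat1)
      dlam = math.radians(lon2 - lon1)
      a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
      return 2 * MEAN_EARTH_RADIUS_KM * math.asin(math.sqrt(a))

  print(haversine_km(10.1069, -64.6892, 21.3328, -71.1999))  # ≈ 1429 km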

How long does it take to fly from Barcelona to Salt Cay?

The estimated flight time from General José Antonio Anzoátegui International Airport to Salt Cay Airport is 2 hours and 10 minutes.
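
The page doesn't say how this estimate is derived. A common rule of thumb for such figures is cruise time at an assumed average speed plus a fixed allowance for climb, descent and taxiing; the 500 mph speed and 30-minute allowance below are assumptions, not the calculator's published method, which is why the result differs by a few minutes from the 2 hours 10 minutes quoted above.

  def estimated_flight_time(distance_miles, cruise_mph=500, ground_buffer_min=30):
      """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance."""
      total_min = distance_miles / cruise_mph * 60 + ground_buffer_min
      hours, minutes = divmod(round(total_min), 60)
      return f"{hours} h {minutes} min"

  print(estimated_flight_time(885))   # about 2 h 16 min under these assumptions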

Flight carbon footprint between General José Antonio Anzoátegui International Airport (BLA) and Salt Cay Airport (SLX)

On average, flying from Barcelona to Salt Cay generates about 142 kg of CO2 per passenger, which is roughly 314 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
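
As a quick sanity check on the arithmetic, the unit conversion and the implied per-kilometre rate (simply the quoted total divided by the quoted distance, not a published emission factor) work out as follows.

  KG_PER_LB = 0.45359237

  co2_kg = 142
  print(co2_kg / KG_PER_LB)        # ≈ 313 lb; the page's 314 lb suggests it rounds from an unrounded kg value
  print(co2_kg / 1424 * 1000)      # ≈ 100 g of CO2 per passenger-kilometre on this route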

Map of flight path from Barcelona to Salt Cay

See the map of the shortest flight path between General José Antonio Anzoátegui International Airport (BLA) and Salt Cay Airport (SLX).

Airport information

Origin: General José Antonio Anzoátegui International Airport
City: Barcelona
Country: Venezuela
IATA Code: BLA
ICAO Code: SVBC
Coordinates: 10°6′25″N, 64°41′21″W
Destination: Salt Cay Airport
City: Salt Cay
Country: Turks and Caicos Islands
IATA Code: SLX
ICAO Code: MBSY
Coordinates: 21°19′58″N, 71°11′59″W