
How far is Salt Cay from San Pedro Sula?

The distance between San Pedro Sula (Ramón Villeda Morales International Airport) and Salt Cay (Salt Cay Airport) is 1169 miles / 1882 kilometers / 1016 nautical miles.

Ramón Villeda Morales International Airport – Salt Cay Airport

1169 miles / 1882 kilometers / 1016 nautical miles


Distance from San Pedro Sula to Salt Cay

There are several ways to calculate the distance from San Pedro Sula to Salt Cay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1169.200 miles
  • 1881.645 kilometers
  • 1016.007 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
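The inverse Vincenty method can be sketched in Python. This is a standard textbook implementation on the WGS-84 ellipsoid, not this site's own code, and the airport coordinates are taken from the airport information section below:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns distance in km."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line: cos2_alpha = 0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break  # converged

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# SAP (15°27′9″N, 87°55′24″W) to SLX (21°19′58″N, 71°11′59″W)
sap = (15 + 27/60 + 9/3600, -(87 + 55/60 + 24/3600))
slx = (21 + 19/60 + 58/3600, -(71 + 11/60 + 59/3600))
print(round(vincenty_km(*sap, *slx), 1))  # ≈ 1881.6 km
```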

Haversine formula
  • 1168.351 miles
  • 1880.279 kilometers
  • 1015.269 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
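The haversine formula is much simpler. A minimal sketch, using a mean Earth radius of 6371 km and the same airport coordinates as above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km by default)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# SAP (15°27′9″N, 87°55′24″W) to SLX (21°19′58″N, 71°11′59″W)
d = haversine_km(15 + 27/60 + 9/3600, -(87 + 55/60 + 24/3600),
                 21 + 19/60 + 58/3600, -(71 + 11/60 + 59/3600))
print(round(d, 1))  # ≈ 1880 km
```

The small gap between the two results (about 1.4 km here) comes from the spherical versus ellipsoidal Earth models.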

How long does it take to fly from San Pedro Sula to Salt Cay?

The estimated flight time from Ramón Villeda Morales International Airport to Salt Cay Airport is 2 hours and 42 minutes.
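A rough way to estimate this yourself is cruise time plus a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute overhead below are assumptions, not this site's exact model, so the sketch lands near but not exactly on the figure above:

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough estimate: cruise time plus a fixed taxi/climb/descent allowance."""
    return distance_miles / cruise_mph * 60 + overhead_min

total = flight_time_minutes(1169.2)
print(f"{int(total // 60)} h {round(total % 60)} min")  # ≈ 2 h 50 min
```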

Flight carbon footprint between Ramón Villeda Morales International Airport (SAP) and Salt Cay Airport (SLX)

On average, flying from San Pedro Sula to Salt Cay generates about 160 kg of CO2 per passenger, which equals about 353 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
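The unit conversion above is straightforward, using the exact definition of the international pound:

```python
KG_PER_LB = 0.45359237   # exact definition of the international pound

co2_kg = 160
co2_lbs = co2_kg / KG_PER_LB
print(round(co2_lbs))    # 353
```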

Map of flight path from San Pedro Sula to Salt Cay

See the map of the shortest flight path between Ramón Villeda Morales International Airport (SAP) and Salt Cay Airport (SLX).

Airport information

Origin: Ramón Villeda Morales International Airport
City: San Pedro Sula
Country: Honduras
IATA Code: SAP
ICAO Code: MHLM
Coordinates: 15°27′9″N, 87°55′24″W
Destination: Salt Cay Airport
City: Salt Cay
Country: Turks and Caicos Islands
IATA Code: SLX
ICAO Code: MBSY
Coordinates: 21°19′58″N, 71°11′59″W