
How far is Salt Cay from Kingston?

The distance between Kingston (Norman Manley International Airport) and Salt Cay (Salt Cay Airport) is 433 miles / 696 kilometers / 376 nautical miles.


Distance from Kingston to Salt Cay

There are several ways to calculate the distance from Kingston to Salt Cay. Here are two standard methods:

Vincenty's formula (applied above)
  • 432.613 miles
  • 696.223 kilometers
  • 375.931 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
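As a concrete sketch, the geopy library's `geodesic` function solves the same ellipsoidal (WGS-84) problem. Note that geopy uses Karney's algorithm rather than Vincenty's original iteration, so treat this as an equivalent stand-in rather than the calculator's exact code; the decimal coordinates are converted from the airport coordinates listed at the bottom of this page.

```python
from geopy.distance import geodesic

# Airport coordinates converted to decimal degrees (west longitude is negative)
kin = (17.9356, -76.7872)   # KIN: 17°56′8″N, 76°47′14″W
slx = (21.3328, -71.1997)   # SLX: 21°19′58″N, 71°11′59″W

# geodesic() measures along the WGS-84 ellipsoid, like Vincenty's formula
d = geodesic(kin, slx)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
# ≈ 432.6 mi / 696.2 km / 375.9 NM
```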

Haversine formula
  • 432.721 miles
  • 696.397 kilometers
  • 376.024 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface).
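The haversine formula is compact enough to show in full. A minimal Python sketch, assuming a mean Earth radius of 3,958.8 miles (one common choice; the calculator's exact radius isn't stated):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance in miles on a spherical Earth model."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_mi * math.asin(math.sqrt(a))

# KIN (17°56′8″N, 76°47′14″W) to SLX (21°19′58″N, 71°11′59″W)
print(round(haversine_miles(17.9356, -76.7872, 21.3328, -71.1997), 3))
# ≈ 432.7; it differs slightly from Vincenty because of the spherical model
```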

How long does it take to fly from Kingston to Salt Cay?

The estimated flight time from Norman Manley International Airport to Salt Cay Airport is 1 hour and 19 minutes.
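The calculator's speed model isn't stated, but as a rough check the figure is consistent with an assumed average block speed of about 330 mph: 433 mi ÷ 330 mph ≈ 1.31 h ≈ 1 hour 19 minutes.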

What is the time difference between Kingston and Salt Cay?

There is no time difference between Kingston and Salt Cay.

Flight carbon footprint between Norman Manley International Airport (KIN) and Salt Cay Airport (SLX)

On average, flying from Kingston to Salt Cay generates about 89 kg of CO2 per passenger, equivalent to 196 pounds (lbs) at roughly 2.2046 lb per kg. The figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path from Kingston to Salt Cay

See the map of the shortest flight path between Norman Manley International Airport (KIN) and Salt Cay Airport (SLX).

Airport information

Origin: Norman Manley International Airport
City: Kingston
Country: Jamaica
IATA Code: KIN
ICAO Code: MKJP
Coordinates: 17°56′8″N, 76°47′14″W
Destination: Salt Cay Airport
City: Salt Cay
Country: Turks and Caicos Islands
IATA Code: SLX
ICAO Code: MBSY
Coordinates: 21°19′58″N, 71°11′59″W
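
The coordinates above are given in degrees, minutes, and seconds. To use them with the distance formulas earlier on this page, they must be converted to decimal degrees. A small sketch (the helper name `dms_to_decimal` is just illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees.

    Southern and western hemispheres are negative by convention.
    """
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# KIN: 17°56′8″N, 76°47′14″W  →  (17.9356, -76.7872)
print(dms_to_decimal(17, 56, 8, "N"), dms_to_decimal(76, 47, 14, "W"))

# SLX: 21°19′58″N, 71°11′59″W  →  (21.3328, -71.1997)
print(dms_to_decimal(21, 19, 58, "N"), dms_to_decimal(71, 11, 59, "W"))
```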