How far is Salt Cay from Concord, NC?

The distance between Concord (Concord-Padgett Regional Airport) and Salt Cay (Salt Cay Airport) is 1126 miles / 1813 kilometers / 979 nautical miles.

Distance from Concord to Salt Cay

There are several ways to calculate the distance from Concord to Salt Cay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1126.499 miles
  • 1812.925 kilometers
  • 978.901 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
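
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse solution on the WGS-84 ellipsoid. The coordinates are the decimal form of the airport coordinates listed under Airport information below; the calculator's exact parameters are not stated, so the last decimals may differ slightly.

    import math

    def vincenty_distance_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Vincenty inverse solution on the WGS-84 ellipsoid; returns kilometers."""
        a = 6378137.0                  # semi-major axis (m)
        f = 1 / 298.257223563          # flattening
        b = (1 - f) * a                # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0             # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                         (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1000.0

    # Concord-Padgett Regional (USA) to Salt Cay (SLX), decimal degrees
    print(vincenty_distance_km(35.3878, -80.7089, 21.3328, -71.1997))  # ~1812.9 km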

Haversine formula
  • 1128.701 miles
  • 1816.468 kilometers
  • 980.815 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
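
A haversine implementation is much shorter. The sketch below assumes a mean earth radius of 6,371 km; feeding it the same decimal coordinates reproduces the roughly 1,816 km figure above.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
        """Great-circle distance in kilometers, assuming a spherical earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    print(haversine_km(35.3878, -80.7089, 21.3328, -71.1997))  # ~1816 km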

How long does it take to fly from Concord to Salt Cay?

The estimated flight time from Concord-Padgett Regional Airport to Salt Cay Airport is 2 hours and 37 minutes.
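
The page does not say how this estimate is derived. A common rule of thumb is cruise time at a typical airliner speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses hypothetical parameters for illustration only, and slightly different values would reproduce the 2 hours 37 minutes quoted above.

    def flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
        """Rough estimate: cruise time plus a fixed taxi/climb/descent allowance."""
        total_min = distance_miles / cruise_mph * 60 + overhead_min
        return divmod(round(total_min), 60)

    h, m = flight_time(1126.5)
    print(f"{h} h {m} min")   # ~2 h 45 min with these assumed parameters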

What is the time difference between Concord and Salt Cay?

There is no time difference between Concord and Salt Cay.

Flight carbon footprint between Concord-Padgett Regional Airport (USA) and Salt Cay Airport (SLX)

On average, flying from Concord to Salt Cay generates about 158 kg (349 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
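
As a sanity check on the unit conversion (not the calculator's actual method): 1 lb is exactly 0.45359237 kg, so the rounded 158 kg converts to about 348 lb. The 349 lb above presumably comes from converting an unrounded kilogram value before rounding.

    KG_PER_LB = 0.45359237              # exact by definition

    co2_kg = 158                        # rounded figure from this page
    print(round(co2_kg / KG_PER_LB))    # 348; 349 likely converts pre-rounding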

Map of flight path from Concord to Salt Cay

See the map of the shortest flight path between Concord-Padgett Regional Airport (USA) and Salt Cay Airport (SLX).

Airport information

Origin: Concord-Padgett Regional Airport
City: Concord, NC
Country: United States
IATA Code: USA
ICAO Code: KJQF
Coordinates: 35°23′16″N, 80°42′32″W
Destination: Salt Cay Airport
City: Salt Cay
Country: Turks and Caicos Islands
IATA Code: SLX
ICAO Code: MBSY
Coordinates: 21°19′58″N, 71°11′59″W
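
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small helper (hypothetical, for illustration) performs the conversion:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    concord  = (dms_to_decimal(35, 23, 16, "N"), dms_to_decimal(80, 42, 32, "W"))
    salt_cay = (dms_to_decimal(21, 19, 58, "N"), dms_to_decimal(71, 11, 59, "W"))
    print(concord, salt_cay)   # ≈ (35.3878, -80.7089) and (21.3328, -71.1997)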