
How far is Salt Cay from Ashtabula, OH?

The distance between Ashtabula (Northeast Ohio Regional Airport) and Salt Cay (Salt Cay Airport) is 1514 miles / 2436 kilometers / 1315 nautical miles.

Northeast Ohio Regional Airport – Salt Cay Airport: 1514 miles / 2436 kilometers / 1315 nautical miles


Distance from Ashtabula to Salt Cay

There are several ways to calculate the distance from Ashtabula to Salt Cay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1513.549 miles
  • 2435.820 kilometers
  • 1315.238 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
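As a rough cross-check, the ellipsoidal figure can be reproduced in Python with the geopy library, whose geodesic distance also uses the WGS-84 ellipsoid (via Karney's algorithm rather than Vincenty's, so the result may differ by a small fraction of a mile). The coordinates below are taken from the airport information section at the end of this page.

# Ellipsoidal distance cross-check (pip install geopy).
# geopy's geodesic uses Karney's algorithm on WGS-84, not Vincenty's
# formula, so expect agreement only to within a small fraction of a mile.
from geopy.distance import geodesic

jfn = (41.77778, -80.69528)  # Northeast Ohio Regional Airport (JFN)
slx = (21.33278, -71.19972)  # Salt Cay Airport (SLX)

d = geodesic(jfn, slx)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
# Expect roughly 1513.5 mi / 2435.8 km / 1315.2 NM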

Haversine formula
  • 1516.812 miles
  • 2441.073 kilometers
  • 1318.074 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
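For reference, here is a minimal Python sketch of the haversine formula using a mean Earth radius of 6,371 km; with the airport coordinates listed at the end of this page it reproduces the spherical figures above to within rounding.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle (spherical) distance between two lat/lon points.
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(41.77778, -80.69528, 21.33278, -71.19972)  # JFN -> SLX
print(f"{km:.3f} km = {km / 1.609344:.3f} mi = {km / 1.852:.3f} NM")
# Roughly 2441 km / 1517 mi / 1318 NM, matching the figures above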

How long does it take to fly from Ashtabula to Salt Cay?

The estimated flight time from Northeast Ohio Regional Airport to Salt Cay Airport is 3 hours and 21 minutes.
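The calculator does not publish the formula behind this figure, but estimates like this are usually derived from the great-circle distance, an assumed average cruise speed, and a fixed allowance for taxi, climb and descent. A minimal sketch, with both parameters as assumptions rather than the calculator's actual values:

def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    # Cruise speed and overhead are illustrative assumptions, so the
    # result only approximates the 3 hours 21 minutes quoted above.
    total_min = overhead_min + distance_miles / cruise_mph * 60.0
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes:02d} min"

print(estimated_flight_time(1513.549))  # about 3 h 32 min with these assumptions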

What is the time difference between Ashtabula and Salt Cay?

There is no time difference between Ashtabula and Salt Cay.
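This can be confirmed against the IANA time zone database: Ashtabula falls under America/New_York and Salt Cay under America/Grand_Turk, and both currently observe the same UTC offset year-round. A quick check with Python 3.9+ (zoneinfo):

from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # may need "pip install tzdata" on Windows

now = datetime.now(timezone.utc)
ashtabula = now.astimezone(ZoneInfo("America/New_York")).utcoffset()
salt_cay = now.astimezone(ZoneInfo("America/Grand_Turk")).utcoffset()
print(ashtabula == salt_cay)  # True: both are on Eastern Time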

Flight carbon footprint between Northeast Ohio Regional Airport (JFN) and Salt Cay Airport (SLX)

On average, flying from Ashtabula to Salt Cay generates about 180 kg of CO2 per passenger (roughly 398 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
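The calculator does not state how the figure is derived, but per-passenger CO2 estimates of this kind typically scale with distance. Purely as an illustration, an emission factor of about 0.074 kg of CO2 per passenger-kilometre reproduces roughly 180 kg over the 2,436 km route; both the factor and the model below are assumptions, not the calculator's published method.

KG_PER_LB = 0.45359237

def co2_per_passenger_kg(distance_km, kg_co2_per_pax_km=0.074):
    # The emission factor is an illustrative assumption only.
    return distance_km * kg_co2_per_pax_km

kg = co2_per_passenger_kg(2435.82)
print(f"~{kg:.0f} kg CO2 per passenger (~{kg / KG_PER_LB:.0f} lb)")
# ~180 kg (~397 lb) with this assumed factor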

Map of flight path from Ashtabula to Salt Cay

See the map of the shortest flight path between Northeast Ohio Regional Airport (JFN) and Salt Cay Airport (SLX).

Airport information

Origin Northeast Ohio Regional Airport
City: Ashtabula, OH
Country: United States
IATA Code: JFN
ICAO Code: KHZY
Coordinates: 41°46′40″N, 80°41′43″W
Destination Salt Cay Airport
City: Salt Cay
Country: Turks and Caicos Islands
IATA Code: SLX
ICAO Code: MBSY
Coordinates: 21°19′58″N, 71°11′59″W