
How far is Salt Cay from Binghamton, NY?

The distance between Binghamton (Greater Binghamton Airport) and Salt Cay (Salt Cay Airport) is 1465 miles / 2358 kilometers / 1273 nautical miles.

Greater Binghamton Airport – Salt Cay Airport

Distance from Binghamton to Salt Cay

There are several ways to calculate the distance from Binghamton to Salt Cay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1464.999 miles
  • 2357.688 kilometers
  • 1273.050 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
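
For reference, an ellipsoidal distance like the Vincenty figure above can be reproduced with the third-party geopy library. This is a minimal sketch, not the site's own code: geopy's geodesic distance uses Karney's method on the WGS-84 ellipsoid, which agrees with Vincenty's formula to well under a meter. The coordinates are the BGM and SLX values from the airport information section below, converted to decimal degrees.

    # Ellipsoidal (WGS-84) distance using the third-party geopy library.
    # geopy's geodesic distance (Karney's method) closely matches Vincenty's formula.
    from geopy.distance import geodesic

    bgm = (42.2086, -75.9797)   # Greater Binghamton Airport (42°12′31″N, 75°58′47″W)
    slx = (21.3328, -71.1997)   # Salt Cay Airport (21°19′58″N, 71°11′59″W)

    d = geodesic(bgm, slx)
    print(f"{d.miles:.3f} miles")          # ≈ 1465 miles
    print(f"{d.km:.3f} kilometers")        # ≈ 2358 kilometers
    print(f"{d.nautical:.3f} nautical miles")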

Haversine formula
  • 1468.799 miles
  • 2363.803 kilometers
  • 1276.351 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the surface of the sphere).
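
The haversine formula is simple enough to implement directly. The sketch below assumes a mean earth radius of 6371 km; the site's exact radius may differ slightly, which would shift the result by a fraction of a percent.

    # Great-circle (haversine) distance, assuming a spherical earth.
    from math import asin, cos, radians, sin, sqrt

    EARTH_RADIUS_KM = 6371.0  # mean earth radius (assumption)

    def haversine_km(lat1, lon1, lat2, lon2):
        """Return the great-circle distance between two points in kilometers."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

    km = haversine_km(42.2086, -75.9797, 21.3328, -71.1997)
    print(f"{km:.3f} km, {km / 1.609344:.3f} mi, {km / 1.852:.3f} NM")
    # ≈ 2363.8 km / 1468.8 mi / 1276.4 NM, matching the figures above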

How long does it take to fly from Binghamton to Salt Cay?

The estimated flight time from Greater Binghamton Airport to Salt Cay Airport is 3 hours and 16 minutes.
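
The page does not state how the flight time is derived. A common rule of thumb divides the great-circle distance by an assumed average block speed; the 450 mph figure below is an assumption chosen because it roughly reproduces the quoted estimate, not the site's documented method.

    # Rough flight-time estimate: distance / assumed average block speed.
    # 450 mph is an assumption, not the site's documented method.
    def flight_time(distance_miles: float, avg_speed_mph: float = 450.0) -> str:
        minutes = round(distance_miles / avg_speed_mph * 60)
        return f"{minutes // 60} hours and {minutes % 60} minutes"

    print(flight_time(1465))  # -> "3 hours and 15 minutes", close to the quoted 3 h 16 min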

What is the time difference between Binghamton and Salt Cay?

There is no time difference between Binghamton and Salt Cay.

Flight carbon footprint between Greater Binghamton Airport (BGM) and Salt Cay Airport (SLX)

On average, flying from Binghamton to Salt Cay generates about 177 kg of CO2 per passenger, equivalent to 391 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
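
The page does not publish its emission model. Dividing the quoted figure by the distance implies roughly 0.12 kg of CO2 per passenger-mile; the sketch below uses that back-calculated factor as an assumption.

    # Back-of-the-envelope CO2 estimate using a per-passenger-mile factor.
    # The 0.121 kg/mile factor is back-calculated from the quoted 177 kg,
    # not a documented value from the site.
    KG_PER_PASSENGER_MILE = 0.121  # assumption
    KG_TO_LB = 2.20462

    co2_kg = 1465 * KG_PER_PASSENGER_MILE
    print(f"{co2_kg:.0f} kg CO2 (~{co2_kg * KG_TO_LB:.0f} lbs) per passenger")
    # -> roughly 177 kg (~391 lbs)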

Map of flight path from Binghamton to Salt Cay

See the map of the shortest flight path between Greater Binghamton Airport (BGM) and Salt Cay Airport (SLX).

Airport information

Origin: Greater Binghamton Airport
City: Binghamton, NY
Country: United States
IATA Code: BGM
ICAO Code: KBGM
Coordinates: 42°12′31″N, 75°58′47″W
Destination: Salt Cay Airport
City: Salt Cay
Country: Turks and Caicos Islands
IATA Code: SLX
ICAO Code: MBSY
Coordinates: 21°19′58″N, 71°11′59″W