
How far is Salt Cay from Saginaw, MI?

The distance between Saginaw (Saginaw MBS International Airport) and Salt Cay (Salt Cay Airport) is 1700 miles / 2736 kilometers / 1477 nautical miles.


Distance from Saginaw to Salt Cay

There are several ways to calculate the distance from Saginaw to Salt Cay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1700.082 miles
  • 2736.017 kilometers
  • 1477.331 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
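The calculator's own implementation is not published, but Vincenty's inverse formula is well documented. Here is a minimal Python sketch on the WGS-84 ellipsoid, using the airport coordinates listed in the Airport information section below; the convergence tolerance and iteration cap are illustrative choices, not the site's code.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# MBS and SLX coordinates from the Airport information section below
metres = vincenty_distance(43.532778, -84.079444, 21.332778, -71.199722)
print(f"{metres / 1609.344:.3f} miles")  # ≈ 1700 miles
```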

Haversine formula
  • 1703.072 miles
  • 2740.829 kilometers
  • 1479.929 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
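The haversine version is much shorter. Its only free parameter is the sphere's radius; a sketch using the commonly cited 6371 km mean Earth radius lands within about a kilometre of the figure above (the exact result depends on the radius chosen and on coordinate rounding):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_distance(43.532778, -84.079444, 21.332778, -71.199722)
print(f"{km:.3f} km")  # ≈ 2741 km, vs. 2736 km on the ellipsoid
```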

How long does it take to fly from Saginaw to Salt Cay?

The estimated flight time from Saginaw MBS International Airport to Salt Cay Airport is 3 hours and 43 minutes.
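The page does not say how this estimate is computed. A common approach is cruise time at an assumed average ground speed plus a fixed allowance for taxi, takeoff, climb, and approach. The sketch below uses illustrative parameters (500 mph and a 30-minute allowance), so its output will not exactly match the 3 h 43 m quoted above:

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, allowance_min=30):
    """Block-time estimate: cruise leg plus a fixed taxi/climb allowance.
    Both parameters are illustrative assumptions, not the site's published model."""
    total_min = round(distance_miles / cruise_mph * 60 + allowance_min)
    hours, minutes = divmod(total_min, 60)
    return f"{hours} hours and {minutes} minutes"

print(estimate_flight_time(1700.082))  # '3 hours and 54 minutes' with these assumptions
```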

What is the time difference between Saginaw and Salt Cay?

There is no time difference between Saginaw and Salt Cay; both observe Eastern Time.

Flight carbon footprint between Saginaw MBS International Airport (MBS) and Salt Cay Airport (SLX)

On average, flying from Saginaw to Salt Cay generates about 193 kg of CO2 per passenger, which is equivalent to 425 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
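The emission model is likewise unpublished. As a sketch, a linear per-mile factor can be back-derived from the figures above; note that real estimators weight short flights more heavily, since takeoff and climb burn disproportionately more fuel than cruise:

```python
KG_CO2_PER_PASSENGER_MILE = 193 / 1700.082  # implied by this page's figures (~0.1135)
KG_TO_LBS = 2.20462

def co2_per_passenger(distance_miles):
    """Jet-fuel CO2 per passenger, assuming (simplistically) linear scaling."""
    kg = distance_miles * KG_CO2_PER_PASSENGER_MILE
    return kg, kg * KG_TO_LBS

kg, lbs = co2_per_passenger(1700.082)
print(f"{kg:.0f} kg CO2 ≈ {lbs:.0f} lbs")  # 193 kg CO2 ≈ 425 lbs
```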

Map of flight path from Saginaw to Salt Cay

See the map of the shortest flight path between Saginaw MBS International Airport (MBS) and Salt Cay Airport (SLX).

Airport information

Origin: Saginaw MBS International Airport
City: Saginaw, MI
Country: United States
IATA Code: MBS
ICAO Code: KMBS
Coordinates: 43°31′58″N, 84°4′46″W
Destination: Salt Cay Airport
City: Salt Cay
Country: Turks and Caicos Islands
IATA Code: SLX
ICAO Code: MBSY
Coordinates: 21°19′58″N, 71°11′59″W
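To plug the coordinates above into the distance sketches earlier on the page, a small helper converts degrees/minutes/seconds to the signed decimal degrees those functions expect (southern and western hemispheres are negative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert DMS plus a hemisphere letter (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

mbs = (dms_to_decimal(43, 31, 58, "N"), dms_to_decimal(84, 4, 46, "W"))
slx = (dms_to_decimal(21, 19, 58, "N"), dms_to_decimal(71, 11, 59, "W"))
print(mbs)  # (43.532778, -84.079444) to six decimal places
print(slx)  # (21.332778, -71.199722)
```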