How far is Salt Cay from Panama City Beach, FL?
The distance between Panama City Beach (Northwest Florida Beaches International Airport) and Salt Cay (Salt Cay Airport) is 1099 miles / 1769 kilometers / 955 nautical miles.
Northwest Florida Beaches International Airport – Salt Cay Airport
Distance from Panama City Beach to Salt Cay
There are several ways to calculate the distance from Panama City Beach to Salt Cay. Here are two standard methods:
Vincenty's formula (applied above)
- 1098.977 miles
- 1768.632 kilometers
- 954.985 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
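To reproduce an ellipsoidal-model distance like this in code, one option is the geopy library, whose geodesic() function solves the inverse problem on the WGS-84 ellipsoid using Karney's algorithm (a modern successor to Vincenty's formula that gives effectively the same result). A minimal sketch, assuming geopy is installed and using decimal-degree versions of the airport coordinates from the tables below:

```python
# pip install geopy  (assumed available)
from geopy.distance import geodesic

# Airport coordinates in decimal degrees (converted from the airport tables below)
ecp = (30.341667, -85.797222)   # Northwest Florida Beaches Intl (ECP)
slx = (21.332778, -71.199722)   # Salt Cay Airport (SLX)

# geodesic() computes the distance on the WGS-84 ellipsoid (Karney's
# method, a modern replacement for Vincenty's iterative formula).
d = geodesic(ecp, slx)
print(f"{d.miles:.3f} mi / {d.kilometers:.3f} km / {d.nautical:.3f} nmi")
```

Run against these coordinates, this agrees with the ellipsoidal figure above to within a small fraction of a mile.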
Haversine formula
- 1098.956 miles
- 1768.598 kilometers
- 954.966 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
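The haversine formula is short enough to implement directly. A self-contained sketch, assuming a spherical Earth with the IUGG mean radius and the same decimal coordinates as above:

```python
from math import radians, sin, cos, atan2, sqrt

EARTH_RADIUS_KM = 6371.0088  # IUGG mean Earth radius (assumed value)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * atan2(sqrt(a), sqrt(1 - a))

km = haversine_km(30.341667, -85.797222, 21.332778, -71.199722)
print(f"{km / 1.609344:.3f} mi / {km:.3f} km / {km / 1.852:.3f} nmi")
# Prints roughly 1099 mi / 1769 km / 955 nmi, matching the figures above.
```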
How long does it take to fly from Panama City Beach to Salt Cay?
The estimated flight time from Northwest Florida Beaches International Airport to Salt Cay Airport is 2 hours and 34 minutes.
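The page does not state the model behind this estimate; a common rough approach is distance divided by an assumed average block speed. A hypothetical sketch (the 428 mph figure is simply back-solved from the quoted 2 hours 34 minutes over 1099 miles, not a published parameter):

```python
# Hypothetical flight-time estimate: distance / assumed average speed.
# AVG_SPEED_MPH is an assumption back-solved from the quoted 2 h 34 min;
# the actual estimation model is not stated on the page.
DISTANCE_MI = 1099
AVG_SPEED_MPH = 428

hours = DISTANCE_MI / AVG_SPEED_MPH
h, m = int(hours), round((hours % 1) * 60)
print(f"Estimated flight time: {h} h {m} min")  # -> 2 h 34 min
```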
What is the time difference between Panama City Beach and Salt Cay?
The time difference between Panama City Beach and Salt Cay is 1 hour. Salt Cay is 1 hour ahead of Panama City Beach, since Panama City Beach is on Central Time and the Turks and Caicos Islands observe Eastern Time.
Flight carbon footprint between Northwest Florida Beaches International Airport (ECP) and Salt Cay Airport (SLX)
On average, flying from Panama City Beach to Salt Cay generates about 157 kg (345 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
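To show how a per-passenger figure like this can be derived, here is a hedged sketch: the emission factor below is back-calculated from the quoted numbers (157 kg over 1099 miles), not an official coefficient.

```python
# Hypothetical per-passenger CO2 estimate. The emission factor is
# back-calculated from the figures quoted above (157 kg / 1099 mi),
# not an official coefficient.
KG_PER_PASSENGER_MILE = 157 / 1099   # ~0.143 kg CO2 per passenger-mile
LBS_PER_KG = 2.20462

co2_kg = 1099 * KG_PER_PASSENGER_MILE
print(f"{co2_kg:.0f} kg CO2 (~{co2_kg * LBS_PER_KG:.0f} lbs) per passenger")
# Note: 157 kg converts to ~346 lbs; the page's 345 lbs suggests the
# unrounded kg figure was slightly below 157.
```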
Map of flight path from Panama City Beach to Salt Cay
See the map of the shortest flight path between Northwest Florida Beaches International Airport (ECP) and Salt Cay Airport (SLX).
Airport information
| Origin | Northwest Florida Beaches International Airport |
|---|---|
| City: | Panama City Beach, FL |
| Country: | United States |
| IATA Code: | ECP |
| ICAO Code: | KECP |
| Coordinates: | 30°20′30″N, 85°47′50″W |
| Destination | Salt Cay Airport |
|---|---|
| City: | Salt Cay |
| Country: | Turks and Caicos Islands |
| IATA Code: | SLX |
| ICAO Code: | MBSY |
| Coordinates: | 21°19′58″N, 71°11′59″W |
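The tables give coordinates in degrees, minutes, and seconds, while the distance formulas above take decimal degrees. A small sketch of that conversion, using the coordinate strings copied from the tables:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a D°M′S″H string (e.g. 30°20′30″N) to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    # South and west hemispheres are negative by convention.
    return -value if hemi in "SW" else value

ecp = (dms_to_decimal("30°20′30″N"), dms_to_decimal("85°47′50″W"))
slx = (dms_to_decimal("21°19′58″N"), dms_to_decimal("71°11′59″W"))
print(ecp)  # (30.3416..., -85.7972...)
print(slx)  # (21.3327..., -71.1997...)
```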