
How far is West Palm Beach, FL, from Georgetown?

The distance between Georgetown (Owen Roberts International Airport) and West Palm Beach (Palm Beach International Airport) is 515 miles / 829 kilometers / 447 nautical miles.

Owen Roberts International Airport – Palm Beach International Airport

  • 515 miles
  • 829 kilometers
  • 447 nautical miles


Distance from Georgetown to West Palm Beach

There are several ways to calculate the distance from Georgetown to West Palm Beach. Here are two standard methods:

Vincenty's formula (applied above)
  • 514.865 miles
  • 828.595 kilometers
  • 447.405 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
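Vincenty's inverse method can be sketched in a few dozen lines of Python. The sketch below follows the standard published iteration on the WGS-84 ellipsoid; the function name `vincenty_inverse` is illustrative, not an API from this site.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres between two points on the WGS-84 ellipsoid."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    # Series expansion for the ellipsoidal correction
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)
```

With the airport coordinates given below (GCM at 19°17′34″N, 81°21′27″W; PBI at 26°40′59″N, 80°5′44″W) this returns roughly 828.6 km, matching the figure above.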

Haversine formula
  • 516.887 miles
  • 831.848 kilometers
  • 449.162 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
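The haversine computation is much shorter, since it needs no iteration. A minimal sketch, assuming the commonly used mean Earth radius of 6,371 km (the site does not state which radius it uses):

```python
import math

def haversine(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance in kilometres on a sphere of radius r_km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(h))
```

For the GCM and PBI coordinates this gives about 831.8 km, in line with the haversine figure above and roughly 0.4% longer than the Vincenty result.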

How long does it take to fly from Georgetown to West Palm Beach?

The estimated flight time from Owen Roberts International Airport to Palm Beach International Airport is 1 hour and 28 minutes.
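The site does not publish the speed assumption behind this estimate. A common rule of thumb is to divide the distance by a typical jet cruise speed and add a fixed allowance for taxi, climb, and descent; the values below (500 mph cruise, 30-minute overhead) are assumptions for illustration and yield a figure close to, but not exactly, the 1 hour 28 minutes quoted above.

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Assumed rule of thumb: cruise time plus a fixed taxi/climb/descent overhead
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = estimate_flight_minutes(515)  # about 92 minutes under these assumptions
```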

What is the time difference between Georgetown and West Palm Beach?

There is no time difference between Georgetown and West Palm Beach.

Flight carbon footprint between Owen Roberts International Airport (GCM) and Palm Beach International Airport (PBI)

On average, flying from Georgetown to West Palm Beach generates about 101 kg (222 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
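The site does not disclose its emissions model. Back-solving from the numbers above, 101 kg over 515 miles implies roughly 0.196 kg of CO2 per passenger-mile; the sketch below uses that back-solved factor purely for illustration, along with the standard kilogram-to-pound conversion.

```python
KG_PER_PASSENGER_MILE = 0.196   # assumed factor, back-solved from 101 kg / 515 mi
LB_PER_KG = 2.20462             # standard conversion factor

def co2_per_passenger_kg(distance_miles, factor=KG_PER_PASSENGER_MILE):
    """Rough CO2 estimate in kg per passenger for a flight of the given length."""
    return distance_miles * factor

kg = co2_per_passenger_kg(515)  # about 101 kg under this assumed factor
lb = kg * LB_PER_KG             # about 222 lb
```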

Map of flight path from Georgetown to West Palm Beach

See the map of the shortest flight path between Owen Roberts International Airport (GCM) and Palm Beach International Airport (PBI).

Airport information

Origin Owen Roberts International Airport
City: Georgetown
Country: Cayman Islands
IATA Code: GCM
ICAO Code: MWCR
Coordinates: 19°17′34″N, 81°21′27″W
Destination Palm Beach International Airport
City: West Palm Beach, FL
Country: United States
IATA Code: PBI
ICAO Code: KPBI
Coordinates: 26°40′59″N, 80°5′44″W
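The coordinates above are in degrees-minutes-seconds form, while the distance formulas need decimal degrees. The conversion is simple: degrees plus minutes/60 plus seconds/3600, negated for southern or western hemispheres. A minimal sketch (`dms_to_decimal` is an illustrative name):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

lat_gcm = dms_to_decimal(19, 17, 34, "N")   # about  19.2928
lon_gcm = dms_to_decimal(81, 21, 27, "W")   # about -81.3575
lat_pbi = dms_to_decimal(26, 40, 59, "N")   # about  26.6831
lon_pbi = dms_to_decimal(80, 5, 44, "W")    # about -80.0956
```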