How far is San Juan from Georgetown?
The distance between Georgetown (Owen Roberts International Airport) and San Juan (Fernando Luis Ribas Dominicci Airport) is 1000 miles / 1610 kilometers / 869 nautical miles.
Distance from Georgetown to San Juan
There are several ways to calculate the distance from Georgetown to San Juan. Here are two standard methods:
Vincenty's formula (applied above)
- 1000.441 miles
- 1610.053 kilometers
- 869.359 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
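For readers who want to reproduce the figure, here is a minimal Python sketch of the standard Vincenty inverse method on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values in the airport table below; the exact implementation and rounding behind the figure above are not published, so the last decimals may differ.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # 0 for equatorial lines
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# GCM -> SIG, coordinates converted from the airport table below
metres = vincenty_distance(19.29278, -81.35750, 18.45667, -66.09806)
print(metres / 1609.344, metres / 1000, metres / 1852)  # ~1000.4 mi, ~1610.1 km, ~869.4 nm
```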
Haversine formula
- 998.992 miles
- 1607.723 kilometers
- 868.101 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface of the sphere.
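The haversine computation is much shorter. A minimal sketch, assuming a mean Earth radius of 6371 km (the radius behind the site's exact figure is not stated, so the result may differ slightly):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere with the given mean radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# GCM -> SIG using the coordinates from the airport table below
print(haversine_distance(19.29278, -81.35750, 18.45667, -66.09806))  # ~1607.7 km
```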
How long does it take to fly from Georgetown to San Juan?
The estimated flight time from Owen Roberts International Airport to Fernando Luis Ribas Dominicci Airport is 2 hours and 23 minutes.
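The model behind this estimate is not published. A common rule of thumb for pages like this adds a fixed overhead for taxi, climb, and descent to cruise time at an assumed average speed; the sketch below uses assumed parameters (500 mph, 30 minutes) and therefore lands near, but not exactly on, the 2 hours 23 minutes figure.

```python
def flight_time_hours(distance_miles, cruise_mph=500, overhead_hours=0.5):
    # Rule of thumb: fixed taxi/climb/descent overhead plus cruise at an
    # assumed average ground speed. Both parameters are assumptions here;
    # the model behind the quoted 2 h 23 min figure is not published.
    return overhead_hours + distance_miles / cruise_mph

hours = flight_time_hours(1000.441)
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ~2 h 30 min with these assumptions
```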
What is the time difference between Georgetown and San Juan?
The time difference between Georgetown and San Juan is 1 hour. San Juan (Atlantic Standard Time, UTC−4) is 1 hour ahead of Georgetown (Eastern Standard Time, UTC−5), and neither city observes daylight saving time, so the offset holds year-round.
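The offset can be checked against the IANA time zone database; a minimal sketch using Python's standard zoneinfo module (Python 3.9+):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

now = datetime.now(ZoneInfo("UTC"))
gcm = now.astimezone(ZoneInfo("America/Cayman"))       # Georgetown, Cayman Islands
sig = now.astimezone(ZoneInfo("America/Puerto_Rico"))  # San Juan, Puerto Rico
print(sig.utcoffset() - gcm.utcoffset())  # 1:00:00 — neither zone observes DST
```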
Flight carbon footprint between Owen Roberts International Airport (GCM) and Fernando Luis Ribas Dominicci Airport (SIG)
On average, flying from Georgetown to San Juan generates about 151 kg of CO2 per passenger; 151 kilograms is equal to 333 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
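For reference, here is the unit conversion and the per-mile intensity implied by the two quoted figures (the intensity is derived here, not stated by the source):

```python
co2_kg = 151                     # per-passenger estimate quoted above
miles = 1000.441                 # Vincenty distance from the section above
print(round(co2_kg * 2.20462))   # kg -> lbs: ~333 lbs
print(round(co2_kg / miles, 3))  # implied intensity: ~0.151 kg CO2 per passenger-mile
```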
Map of flight path from Georgetown to San Juan
See the map of the shortest flight path between Owen Roberts International Airport (GCM) and Fernando Luis Ribas Dominicci Airport (SIG).
Airport information
| Origin | Owen Roberts International Airport |
| --- | --- |
| City | Georgetown |
| Country | Cayman Islands |
| IATA Code | GCM |
| ICAO Code | MWCR |
| Coordinates | 19°17′34″N, 81°21′27″W |
| Destination | Fernando Luis Ribas Dominicci Airport |
| --- | --- |
| City | San Juan |
| Country | Puerto Rico |
| IATA Code | SIG |
| ICAO Code | TJIG |
| Coordinates | 18°27′24″N, 66°5′53″W |
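As a footnote, a small helper showing how the DMS coordinates in the tables above convert to the decimal degrees used in the distance sketches earlier (the helper name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Southern and western hemispheres take a negative sign
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(19, 17, 34, "N"), dms_to_decimal(81, 21, 27, "W"))  # GCM: 19.2928, -81.3575
print(dms_to_decimal(18, 27, 24, "N"), dms_to_decimal(66, 5, 53, "W"))   # SIG: 18.4567, -66.0981
```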