How far is Charlotte Amalie from Kingston?
The distance between Kingston (Kingston Norman Rogers Airport) and Charlotte Amalie (Charlotte Amalie Harbor Seaplane Base) is 1908 miles / 3070 kilometers / 1658 nautical miles.
Kingston Norman Rogers Airport – Charlotte Amalie Harbor Seaplane Base
Distance from Kingston to Charlotte Amalie
There are several ways to calculate the distance from Kingston to Charlotte Amalie. Here are two standard methods:
Vincenty's formula (applied above)
- 1907.650 miles
- 3070.064 kilometers
- 1657.702 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
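As a concrete illustration, here is a minimal sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport information table further down the page; the iteration cap and convergence tolerance are this sketch's own choices, not values published by the site.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m) and flattening
    b = a * (1 - f)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):  # iterate the longitude difference until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0  # meters -> kilometers

# YGK and SPB in decimal degrees (from the airport information table)
km = vincenty_km(44.2253, -76.5967, 18.3383, -64.9406)
print(f"{km:.1f} km")  # ≈ 3070 km
```

The extra complexity over the haversine formula buys sub-millimeter accuracy on the ellipsoid, which is why the result lands on the 3070.064 km figure quoted above rather than the spherical one.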
Haversine formula
- 1911.860 miles
- 3076.841 kilometers
- 1661.361 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the sphere's surface).
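The spherical calculation is compact enough to show in full. This is a minimal sketch; the 6371 km mean Earth radius is a common convention, and the decimal coordinates are converted from the airport information table further down the page.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles, assuming a spherical Earth."""
    R_KM = 6371.0  # mean Earth radius in kilometers (conventional value)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * R_KM * math.asin(math.sqrt(a))
    return km / 1.609344  # kilometers per statute mile

# YGK and SPB in decimal degrees (from the airport information table)
miles = haversine_miles(44.2253, -76.5967, 18.3383, -64.9406)
print(f"{miles:.1f} mi")  # ≈ 1912 mi
```

The ~4-mile gap between this result and the Vincenty figure above reflects the spherical-Earth assumption.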
How long does it take to fly from Kingston to Charlotte Amalie?
The estimated flight time from Kingston Norman Rogers Airport to Charlotte Amalie Harbor Seaplane Base is 4 hours and 6 minutes.
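The page does not say how that estimate is derived. A common rule of thumb (an assumption here, not the site's published formula) adds a fixed taxi/climb/descent buffer to cruise time at a typical jet speed:

```python
def flight_time_minutes(distance_miles, cruise_mph=500, buffer_min=30):
    """Rough airliner flight-time estimate: fixed ground/climb buffer plus
    cruise time. cruise_mph and buffer_min are assumed round numbers."""
    return buffer_min + distance_miles / cruise_mph * 60

total = flight_time_minutes(1908)
print(f"{int(total // 60)} h {int(total % 60)} min")  # → 4 h 18 min
```

With these assumed constants the rule gives about 4 h 18 min, close to but not exactly the 4 hours 6 minutes quoted above, so the site evidently uses slightly different speed or buffer values.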
What is the time difference between Kingston and Charlotte Amalie?
Kingston, Ontario is in the Eastern Time Zone (UTC−5, or UTC−4 during daylight saving time), while Charlotte Amalie observes Atlantic Standard Time (UTC−4) year-round. Charlotte Amalie is therefore 1 hour ahead of Kingston during standard time and on the same clock time during daylight saving time.
Flight carbon footprint between Kingston Norman Rogers Airport (YGK) and Charlotte Amalie Harbor Seaplane Base (SPB)
On average, flying from Kingston to Charlotte Amalie generates about 209 kg of CO2 per passenger, which is equivalent to 461 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
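The underlying per-mile factor is not published on the page; the constant below is back-derived from the page's own numbers (209 kg over 1908 miles, about 0.11 kg per passenger-mile) and is an assumption of this sketch, not an official emission factor.

```python
KG_PER_PASSENGER_MILE = 209 / 1908   # back-derived from the figures above (~0.11)
LBS_PER_KG = 2.20462                 # kilograms-to-pounds conversion factor

def co2_estimate(distance_miles):
    """Per-passenger CO2 from jet-fuel burn, using the back-derived factor."""
    kg = distance_miles * KG_PER_PASSENGER_MILE
    return kg, kg * LBS_PER_KG

kg, lbs = co2_estimate(1908)
print(f"{kg:.0f} kg ≈ {lbs:.0f} lbs")  # → 209 kg ≈ 461 lbs
```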
Airport information
| Origin | Kingston Norman Rogers Airport |
|---|---|
| City: | Kingston |
| Country: | Canada |
| IATA Code: | YGK |
| ICAO Code: | CYGK |
| Coordinates: | 44°13′31″N, 76°35′48″W |
| Destination | Charlotte Amalie Harbor Seaplane Base |
|---|---|
| City: | Charlotte Amalie |
| Country: | U.S. Virgin Islands |
| IATA Code: | SPB |
| ICAO Code: | VI22 |
| Coordinates: | 18°20′18″N, 64°56′26″W |
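The coordinates above are given in degrees, minutes, and seconds, while the distance formulas work in decimal degrees. A small sketch of the conversion (the function name and regex are this sketch's own, written for the exact `°`/`′`/`″` notation the tables use):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like '44°13′31″N' to signed decimal degrees."""
    m = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    if m is None:
        raise ValueError(f"unrecognized coordinate: {dms!r}")
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(dms_to_decimal("44°13′31″N"))  # ≈ 44.2253
print(dms_to_decimal("76°35′48″W"))  # ≈ -76.5967
```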