How far is Charlotte Amalie from Columbia, SC?
The distance between Columbia (Columbia Metropolitan Airport) and Charlotte Amalie (Charlotte Amalie Harbor Seaplane Base) is 1467 miles / 2361 kilometers / 1275 nautical miles.
Columbia Metropolitan Airport – Charlotte Amalie Harbor Seaplane Base
Distance from Columbia to Charlotte Amalie
There are several ways to calculate the distance from Columbia to Charlotte Amalie. Here are two standard methods:
Vincenty's formula (applied above)
- 1466.960 miles
- 2360.844 kilometers
- 1274.754 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
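The iterative algorithm can be sketched in Python on the WGS-84 ellipsoid. This is a minimal implementation of the standard Vincenty inverse formula, not the site's exact code; the decimal coordinates are converted from the airport table below.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in miles via Vincenty's inverse formula (WGS-84 ellipsoid)."""
    a = 6378137.0            # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = a * (1 - f)          # semi-minor axis (meters)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); equatorial geodesics have cos2_alpha == 0
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # meters -> statute miles

# CAE (33°56′19″N, 81°7′10″W) to SPB (18°20′18″N, 64°56′26″W)
print(round(vincenty_miles(33.93861, -81.11944, 18.33833, -64.94056), 3))
```

Running this for the two airports reproduces the ellipsoidal distance quoted above to within rounding.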
Haversine formula
- 1468.604 miles
- 2363.489 kilometers
- 1276.182 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
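The haversine calculation is short enough to show in full. A minimal Python sketch, assuming a mean earth radius of 6371 km (a common convention; the site's exact radius isn't stated), with the same converted airport coordinates:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in miles assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h)) / 1.609344  # km -> miles

# CAE to SPB, coordinates from the airport table below
print(round(haversine_miles(33.93861, -81.11944, 18.33833, -64.94056), 1))
```

The result agrees with the 1468.6-mile haversine figure above; the small gap from the Vincenty result reflects the spherical versus ellipsoidal earth models.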
How long does it take to fly from Columbia to Charlotte Amalie?
The estimated flight time from Columbia Metropolitan Airport to Charlotte Amalie Harbor Seaplane Base is 3 hours and 16 minutes.
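The site's exact model isn't stated; a common rule of thumb estimates flight time as the great-circle distance at a typical jet cruise speed of about 500 mph, plus roughly 30 minutes for takeoff and landing. Both numbers are assumptions, which is why this sketch lands close to, but not exactly on, the quoted 3 hours 16 minutes:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: cruise time plus a fixed overhead.
    cruise_mph and overhead_min are assumed values, not the site's model."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1467))  # within ~10 minutes of the quoted estimate
```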
What is the time difference between Columbia and Charlotte Amalie?
Charlotte Amalie observes Atlantic Standard Time (UTC−4) year-round, while Columbia is on Eastern Time (UTC−5 standard, UTC−4 during daylight saving time). Charlotte Amalie is therefore one hour ahead of Columbia in winter and on the same time in summer.
Flight carbon footprint between Columbia Metropolitan Airport (CAE) and Charlotte Amalie Harbor Seaplane Base (SPB)
On average, flying from Columbia to Charlotte Amalie generates about 177 kg (391 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path from Columbia to Charlotte Amalie
See the map of the shortest flight path between Columbia Metropolitan Airport (CAE) and Charlotte Amalie Harbor Seaplane Base (SPB).
Airport information
| Origin | Columbia Metropolitan Airport |
| --- | --- |
| City | Columbia, SC |
| Country | United States |
| IATA Code | CAE |
| ICAO Code | KCAE |
| Coordinates | 33°56′19″N, 81°7′10″W |
| Destination | Charlotte Amalie Harbor Seaplane Base |
| --- | --- |
| City | Charlotte Amalie |
| Country | U.S. Virgin Islands |
| IATA Code | SPB |
| ICAO Code | VI22 |
| Coordinates | 18°20′18″N, 64°56′26″W |