How far is Charlotte Amalie from Norfolk, VA?
The distance between Norfolk (Norfolk International Airport) and Charlotte Amalie (Charlotte Amalie Harbor Seaplane Base) is 1450 miles / 2333 kilometers / 1260 nautical miles.
Norfolk International Airport – Charlotte Amalie Harbor Seaplane Base
Distance from Norfolk to Charlotte Amalie
There are several ways to calculate the distance from Norfolk to Charlotte Amalie. Here are two standard methods:
Vincenty's formula (applied above)
- 1449.728 miles
- 2333.110 kilometers
- 1259.779 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
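If you want to reproduce an ellipsoidal-earth figure yourself, a minimal sketch using geopy is shown below. Note that geopy's `geodesic()` uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's formula, but for a route like this the two agree to well under a metre; the decimal coordinates are converted from the DMS values in the airport tables further down.

```python
# Ellipsoidal (WGS-84) distance between ORF and SPB using geopy.
# geodesic() implements Karney's algorithm, not Vincenty's, but both
# model the earth as an ellipsoid and give essentially the same result here.
from geopy.distance import geodesic

orf = (36.89444, -76.20111)   # 36°53′40″N, 76°12′4″W
spb = (18.33833, -64.94056)   # 18°20′18″N, 64°56′26″W

d = geodesic(orf, spb)
print(f"{d.miles:.3f} mi, {d.kilometers:.3f} km, {d.nautical:.3f} nm")
# Should land very close to the Vincenty figures above (~1449.7 mi).
```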
Haversine formula
- 1452.959 miles
- 2338.311 kilometers
- 1262.587 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
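The haversine calculation is simple enough to write from scratch. A minimal sketch, assuming a mean earth radius of 6371 km (slightly different radii give slightly different results):

```python
# Great-circle (haversine) distance on a spherical earth.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Return the great-circle distance in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(36.89444, -76.20111, 18.33833, -64.94056)
print(f"{km:.3f} km = {km / 1.609344:.3f} mi = {km / 1.852:.3f} nm")
# Roughly 2338 km / 1453 mi, matching the haversine figures above.
```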
How long does it take to fly from Norfolk to Charlotte Amalie?
The estimated flight time from Norfolk International Airport to Charlotte Amalie Harbor Seaplane Base is 3 hours and 14 minutes.
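Estimates like this are typically computed from the distance at an assumed average cruise speed plus a fixed allowance for taxi, climb and descent. The sketch below uses illustrative assumptions (500 mph and 30 minutes), so it will not exactly reproduce the 3 hours 14 minutes quoted here:

```python
# Rough flight-time estimate: distance at an assumed average speed plus a
# fixed overhead. The 500 mph and 30-minute values are assumptions for
# illustration, not the exact model used for the figure above.
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    minutes = distance_miles / cruise_mph * 60 + overhead_min
    hours, mins = divmod(round(minutes), 60)
    return f"{hours} h {mins:02d} min"

print(estimated_flight_time(1450))  # ~3 h 24 min with these assumptions
```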
What is the time difference between Norfolk and Charlotte Amalie?
Charlotte Amalie observes Atlantic Standard Time (UTC-4) year-round, while Norfolk is on Eastern Time, so Charlotte Amalie is 1 hour ahead of Norfolk during Eastern Standard Time and on the same clock time during Eastern Daylight Time.
Flight carbon footprint between Norfolk International Airport (ORF) and Charlotte Amalie Harbor Seaplane Base (SPB)
On average, flying from Norfolk to Charlotte Amalie generates about 176 kg of CO2 per passenger, which is roughly 388 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
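A minimal sketch of the arithmetic behind such a figure, assuming a constant per-mile emissions factor; the factor below is simply backed out of the quoted numbers (176 kg over 1450 miles) and is not an official value:

```python
# Per-passenger CO2 sketch: unit conversion plus an implied emissions factor.
KG_PER_LB = 0.45359237

co2_kg = 176
print(f"{co2_kg} kg = {co2_kg / KG_PER_LB:.0f} lb")   # ~388 lb

distance_miles = 1450
factor = co2_kg / distance_miles                      # implied kg CO2 per passenger-mile
print(f"{factor:.3f} kg CO2 per passenger-mile")      # ~0.121 with these numbers
```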
Map of flight path from Norfolk to Charlotte Amalie
See the map of the shortest flight path between Norfolk International Airport (ORF) and Charlotte Amalie Harbor Seaplane Base (SPB).
Airport information
| Origin | Norfolk International Airport |
| --- | --- |
| City: | Norfolk, VA |
| Country: | United States |
| IATA Code: | ORF |
| ICAO Code: | KORF |
| Coordinates: | 36°53′40″N, 76°12′4″W |
| Destination | Charlotte Amalie Harbor Seaplane Base |
| --- | --- |
| City: | Charlotte Amalie |
| Country: | U.S. Virgin Islands |
| IATA Code: | SPB |
| ICAO Code: | VI22 |
| Coordinates: | 18°20′18″N, 64°56′26″W |
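The coordinates in the tables above are in degrees, minutes and seconds. A small illustrative helper (names are our own, not from the source) converts them to the decimal degrees used in the distance examples earlier:

```python
# Convert DMS coordinates to decimal degrees. N/E are positive, S/W negative.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

orf = (dms_to_decimal(36, 53, 40, "N"), dms_to_decimal(76, 12, 4, "W"))
spb = (dms_to_decimal(18, 20, 18, "N"), dms_to_decimal(64, 56, 26, "W"))
print(orf, spb)  # (36.894..., -76.201...) (18.338..., -64.940...)
```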