How far is Charlotte Amalie from Rome?
The distance between Rome (Leonardo da Vinci–Fiumicino Airport) and Charlotte Amalie (Charlotte Amalie Harbor Seaplane Base) is 4739 miles / 7627 kilometers / 4118 nautical miles.
Leonardo da Vinci–Fiumicino Airport – Charlotte Amalie Harbor Seaplane Base
Distance from Rome to Charlotte Amalie
There are several ways to calculate the distance from Rome to Charlotte Amalie. Here are two standard methods:
Vincenty's formula (applied above)
- 4738.949 miles
- 7626.600 kilometers
- 4118.034 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
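For readers who want to reproduce the figure, here is a minimal Python sketch of the Vincenty inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport tables below; the exact result depends on the ellipsoid constants and convergence tolerance used, so expect small differences from the value quoted above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in metres between two WGS-84 points (Vincenty inverse formula)."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0                                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)               # equatorial line
        C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1.0 + 2.0 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1.0 + u2 / 16384.0 * (4096.0 + u2 * (-768.0 + u2 * (320.0 - 175.0 * u2)))
    B = u2 / 1024.0 * (256.0 + u2 * (-128.0 + u2 * (74.0 - 47.0 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sigma_m ** 2) -
        B / 6.0 * cos_2sigma_m * (-3.0 + 4.0 * sin_sigma ** 2)
        * (-3.0 + 4.0 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# FCO (41°48′16″N, 12°15′2″E) and SPB (18°20′18″N, 64°56′26″W) in decimal degrees
metres = vincenty_distance(41.8044, 12.2506, 18.3383, -64.9406)
print(metres / 1609.344)      # miles, close to the 4738.9-mile figure above
```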
Haversine formula
- 4732.506 miles
- 7616.231 kilometers
- 4112.436 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
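The spherical calculation is much shorter. The sketch below assumes a mean Earth radius of 6,371 km (the page does not state which radius it uses, so the last digits may differ slightly from the figures above).

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in miles on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344

# FCO to SPB, decimal degrees from the airport tables below
print(haversine_miles(41.8044, 12.2506, 18.3383, -64.9406))   # ≈ 4733 miles
```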
How long does it take to fly from Rome to Charlotte Amalie?
The estimated flight time from Leonardo da Vinci–Fiumicino Airport to Charlotte Amalie Harbor Seaplane Base is 9 hours and 28 minutes.
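The page does not state how this estimate is derived; one assumption that reproduces the quoted figure is the ellipsoidal distance above flown at an average speed of roughly 500 mph, with the minutes truncated.

```python
# Hypothetical back-of-the-envelope check; the ~500 mph average speed is an
# assumption, not a figure stated by the page.
distance_miles = 4738.949
hours = distance_miles / 500.0
print(f"{int(hours)} h {int((hours - int(hours)) * 60)} min")   # -> 9 h 28 min
```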
What is the time difference between Rome and Charlotte Amalie?
The time difference between Rome and Charlotte Amalie is 5 hours: Charlotte Amalie is 5 hours behind Rome, and 6 hours behind while Rome observes Central European Summer Time, since the U.S. Virgin Islands do not use daylight saving time.
Flight carbon footprint between Leonardo da Vinci–Fiumicino Airport (FCO) and Charlotte Amalie Harbor Seaplane Base (SPB)
On average, flying from Rome to Charlotte Amalie generates about 550 kg of CO2 per passenger, which is equal to about 1,213 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
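As a quick unit check on that conversion (1 kg ≈ 2.20462 lb):

```python
# Kilograms-to-pounds sanity check for the footprint figure above
co2_kg = 550
print(round(co2_kg * 2.20462))   # ≈ 1213 lb
```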
Map of flight path from Rome to Charlotte Amalie
See the map of the shortest flight path between Leonardo da Vinci–Fiumicino Airport (FCO) and Charlotte Amalie Harbor Seaplane Base (SPB).
Airport information
| Origin | Leonardo da Vinci–Fiumicino Airport |
| --- | --- |
| City: | Rome |
| Country: | Italy |
| IATA Code: | FCO |
| ICAO Code: | LIRF |
| Coordinates: | 41°48′16″N, 12°15′2″E |

| Destination | Charlotte Amalie Harbor Seaplane Base |
| --- | --- |
| City: | Charlotte Amalie |
| Country: | U.S. Virgin Islands |
| IATA Code: | SPB |
| ICAO Code: | VI22 |
| Coordinates: | 18°20′18″N, 64°56′26″W |