How far is Charlotte Amalie from London?

The distance between London (London Heathrow Airport) and Charlotte Amalie (Charlotte Amalie Harbor Seaplane Base) is 4145 miles / 6671 kilometers / 3602 nautical miles.

Distance from London to Charlotte Amalie

There are several ways to calculate the distance from London to Charlotte Amalie. Here are two standard methods:

Vincenty's formula (applied above)
  • 4145.028 miles
  • 6670.775 kilometers
  • 3601.930 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
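The figure above can be reproduced with a short implementation of Vincenty's inverse method on the WGS-84 ellipsoid. Below is a minimal Python sketch; the function name, the convergence tolerance, and the choice of WGS-84 parameters are assumptions, since the calculator's exact implementation is not published.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres between two points, Vincenty inverse method."""
    # WGS-84 ellipsoid parameters (assumed; the site does not state its ellipsoid)
    a = 6378137.0                 # semi-major axis, metres
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude on the auxiliary sphere converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Decimal-degree approximations of the airport coordinates listed further below
lhr = (51.4706, -0.4619)
spb = (18.3383, -64.9406)
print(round(vincenty_distance(*lhr, *spb) / 1609.344, 1))  # ≈ 4145.0 miles
```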

Haversine formula
  • 4141.564 miles
  • 6665.202 kilometers
  • 3598.921 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
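For comparison, the haversine great-circle distance takes only a few lines. A minimal sketch, assuming the IUGG mean Earth radius of 6371.0088 km (the site's exact radius choice is not stated):

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0088):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(round(haversine(51.4706, -0.4619, 18.3383, -64.9406)))  # ≈ 6665 km
```

The small gap between the two results (about 3.5 miles here) comes entirely from the spherical-Earth assumption: the ellipsoidal Vincenty model tracks the planet's true shape more closely.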

How long does it take to fly from London to Charlotte Amalie?

The estimated flight time from London Heathrow Airport to Charlotte Amalie Harbor Seaplane Base is 8 hours and 20 minutes.
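That estimate is consistent with a simple distance-over-speed calculation. The sketch below assumes an average block speed of about 500 mph and rounding to the nearest 10 minutes; neither is the calculator's documented formula.

```python
def estimate_flight_time(distance_miles, avg_speed_mph=500, round_to_min=10):
    # Rough block-time estimate: distance / average speed, rounded to the
    # nearest 10 minutes. Both constants are assumptions, not the site's rule.
    minutes = distance_miles / avg_speed_mph * 60
    minutes = round(minutes / round_to_min) * round_to_min
    return divmod(int(minutes), 60)

h, m = estimate_flight_time(4145)
print(f"{h} hours and {m} minutes")  # -> 8 hours and 20 minutes
```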

Flight carbon footprint between London Heathrow Airport (LHR) and Charlotte Amalie Harbor Seaplane Base (SPB)

On average, flying from London to Charlotte Amalie generates about 475 kg of CO2 per passenger, which is roughly 1,046 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
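The kilograms-to-pounds conversion is plain arithmetic, assuming the rounded 475 kg figure quoted above:

```python
KG_TO_LB = 2.2046226  # pounds per kilogram

co2_kg = 475  # rounded per-passenger estimate quoted above
print(round(co2_kg * KG_TO_LB))  # -> 1047; the quoted 1,046 lbs suggests the
                                 # unrounded estimate is slightly below 475 kg
```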

Map of flight path from London to Charlotte Amalie

See the map of the shortest flight path between London Heathrow Airport (LHR) and Charlotte Amalie Harbor Seaplane Base (SPB).

Airport information

Origin: London Heathrow Airport
City: London
Country: United Kingdom
IATA Code: LHR
ICAO Code: EGLL
Coordinates: 51°28′14″N, 0°27′42″W
Destination: Charlotte Amalie Harbor Seaplane Base
City: Charlotte Amalie
Country: U.S. Virgin Islands
IATA Code: SPB
ICAO Code: VI22
Coordinates: 18°20′18″N, 64°56′26″W
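The coordinates above are given in degrees, minutes, and seconds; to feed them into the distance formulas sketched earlier they must be converted to signed decimal degrees. A small hypothetical helper:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    # Convert degrees/minutes/seconds plus a hemisphere letter ("N", "S",
    # "E", "W") to signed decimal degrees; south and west are negative.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(51, 28, 14, "N"), dms_to_decimal(0, 27, 42, "W"))
# -> 51.470556 -0.461667  (LHR)
print(dms_to_decimal(18, 20, 18, "N"), dms_to_decimal(64, 56, 26, "W"))
# -> 18.338333 -64.940556 (SPB)
```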