
How far is Charlotte Amalie from London?

The distance between London (London Gatwick Airport) and Charlotte Amalie (Charlotte Amalie Harbor Seaplane Base) is 4154 miles / 6685 kilometers / 3609 nautical miles.

London Gatwick Airport – Charlotte Amalie Harbor Seaplane Base

  • 4154 miles
  • 6685 kilometers
  • 3609 nautical miles


Distance from London to Charlotte Amalie

There are several ways to calculate the distance from London to Charlotte Amalie. Here are two standard methods:

Vincenty's formula (applied above)
  • 4153.617 miles
  • 6684.598 kilometers
  • 3609.394 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
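As a rough illustration, the Python snippet below computes the same ellipsoidal (WGS-84) distance using geopy's geodesic function. Note that geopy implements Karney's algorithm rather than Vincenty's, so this is not the site's exact code, but for this airport pair it should land very close to the figures above. The decimal coordinates are converted from the DMS values listed in the airport information section.

```python
from geopy.distance import geodesic

# Decimal-degree coordinates converted from the DMS values in the airport
# information section (51°8′53″N, 0°11′25″W and 18°20′18″N, 64°56′26″W).
LGW = (51.148056, -0.190278)   # London Gatwick Airport
SPB = (18.338333, -64.940556)  # Charlotte Amalie Harbor Seaplane Base

# geodesic() measures distance on the WGS-84 ellipsoid (Karney's method,
# comparable to the ellipsoidal Vincenty result quoted above).
d = geodesic(LGW, SPB)

print(f"{d.miles:.3f} miles")
print(f"{d.kilometers:.3f} kilometers")
print(f"{d.nautical:.3f} nautical miles")
```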

Haversine formula
  • 4150.071 miles
  • 6678.892 kilometers
  • 3606.313 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
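A minimal haversine implementation, assuming a mean Earth radius of 6,371 km (the exact figures above depend on which radius the calculator uses), might look like this:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# LGW -> SPB, using the decimal coordinates from the airport information section
km = haversine_km(51.148056, -0.190278, 18.338333, -64.940556)
print(f"{km:.3f} km  /  {km * 0.621371:.3f} mi  /  {km / 1.852:.3f} nm")
```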

How long does it take to fly from London to Charlotte Amalie?

The estimated flight time from London Gatwick Airport to Charlotte Amalie Harbor Seaplane Base is 8 hours and 21 minutes.
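The page does not state the exact formula behind this estimate. A common rule of thumb is to divide the distance by an assumed average speed of roughly 500 mph, which comes out within a few minutes of the 8 hours 21 minutes shown above:

```python
# Rough flight-time estimate: distance divided by an assumed average speed.
# The ~500 mph figure is an assumption, not the calculator's published formula.
distance_miles = 4154
avg_speed_mph = 500

hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"Estimated flight time: {h} hours and {m} minutes")  # ~8 hours and 18 minutes
```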

Flight carbon footprint between London Gatwick Airport (LGW) and Charlotte Amalie Harbor Seaplane Base (SPB)

On average, flying from London to Charlotte Amalie generates about 476 kg of CO2 per passenger (equal to roughly 1,048 pounds / lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
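Working backwards from the numbers above, 476 kg over roughly 6,685 km implies an emission factor of about 0.07 kg of CO2 per passenger-kilometre. The sketch below reproduces that arithmetic (the factor is inferred from the page's own figures, not the calculator's documented value) and converts the result to pounds; small differences from the quoted 1,048 lbs come from rounding of the kilogram figure.

```python
# Inferred (not official) per-passenger emission factor: ~476 kg CO2 over 6685 km.
distance_km = 6685
kg_co2_per_pax_km = 476 / 6685            # ≈ 0.0712 kg CO2 per passenger-km

co2_kg = distance_km * kg_co2_per_pax_km
co2_lbs = co2_kg * 2.20462                # kilograms -> pounds

print(f"≈ {co2_kg:.0f} kg CO2 per passenger (≈ {co2_lbs:.0f} lbs)")
```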

Map of flight path from London to Charlotte Amalie

See the map of the shortest flight path between London Gatwick Airport (LGW) and Charlotte Amalie Harbor Seaplane Base (SPB).

Airport information

Origin: London Gatwick Airport
City: London
Country: United Kingdom
IATA Code: LGW
ICAO Code: EGKK
Coordinates: 51°8′53″N, 0°11′25″W

Destination: Charlotte Amalie Harbor Seaplane Base
City: Charlotte Amalie
Country: U.S. Virgin Islands
IATA Code: SPB
ICAO Code: VI22
Coordinates: 18°20′18″N, 64°56′26″W
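The coordinates above are listed in degrees, minutes and seconds, while the distance formulas earlier on the page expect decimal degrees. A small helper for that conversion (an illustrative sketch, not part of this site) could look like:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# London Gatwick Airport: 51°8′53″N, 0°11′25″W
print(dms_to_decimal(51, 8, 53, "N"), dms_to_decimal(0, 11, 25, "W"))
# Charlotte Amalie Harbor Seaplane Base: 18°20′18″N, 64°56′26″W
print(dms_to_decimal(18, 20, 18, "N"), dms_to_decimal(64, 56, 26, "W"))
```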