
How far is Charlotte Amalie from Luxembourg?

The distance between Luxembourg (Luxembourg Airport) and Charlotte Amalie (Charlotte Amalie Harbor Seaplane Base) is 4427 miles / 7124 kilometers / 3847 nautical miles.

Luxembourg Airport – Charlotte Amalie Harbor Seaplane Base

  • 4427 miles
  • 7124 kilometers
  • 3847 nautical miles


Distance from Luxembourg to Charlotte Amalie

There are several ways to calculate the distance from Luxembourg to Charlotte Amalie. Here are two standard methods:

Vincenty's formula (applied above)
  • 4426.804 miles
  • 7124.250 kilometers
  • 3846.787 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
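For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name is our own, and the coordinates are the decimal form of the airport coordinates listed further down the page:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial case
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# LUX (49°37′35″N, 6°12′41″E) to SPB (18°20′18″N, 64°56′26″W)
meters = vincenty_distance(49.6264, 6.2114, 18.3383, -64.9406)
print(meters / 1000)       # ≈ 7124 km, in line with the figure above
print(meters / 1609.344)   # ≈ 4427 miles
```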

Haversine formula
  • 4422.136 miles
  • 7116.738 kilometers
  • 3842.731 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
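The haversine version is much shorter. A minimal sketch, assuming the commonly quoted mean Earth radius of 6371 km (the exact result depends on the radius chosen):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same LUX -> SPB coordinates as above
print(haversine_distance(49.6264, 6.2114, 18.3383, -64.9406))  # ≈ 7117 km
```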

How long does it take to fly from Luxembourg to Charlotte Amalie?

The estimated flight time from Luxembourg Airport to Charlotte Amalie Harbor Seaplane Base is 8 hours and 52 minutes.
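The calculator does not publish its timing model, but dividing the Vincenty distance by a typical airliner cruise speed of about 500 mph (an assumption, not the site's stated method) lands within a minute of the figure above:

```python
distance_miles = 4426.804
cruise_mph = 500  # assumed average speed; the calculator's exact method isn't stated
hours = distance_miles / cruise_mph
h, m = divmod(round(hours * 60), 60)
print(f"{h} h {m} min")  # 8 h 51 min, within a minute of the 8 h 52 min shown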

Flight carbon footprint between Luxembourg Airport (LUX) and Charlotte Amalie Harbor Seaplane Base (SPB)

On average, flying from Luxembourg to Charlotte Amalie generates about 510 kg of CO2 per passenger, which is roughly 1,125 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
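The kilogram-to-pound conversion is simple arithmetic; the one-pound mismatch below suggests the site converts before rounding the kilogram figure:

```python
co2_kg = 510               # rounded per-passenger estimate from above
co2_lb = co2_kg * 2.20462  # 1 kg ≈ 2.20462 lb
print(round(co2_lb))       # 1124; the page's 1,125 lb implies ~510.3 kg before rounding
```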

Map of flight path from Luxembourg to Charlotte Amalie

See the map of the shortest flight path between Luxembourg Airport (LUX) and Charlotte Amalie Harbor Seaplane Base (SPB).

Airport information

Origin: Luxembourg Airport
City: Luxembourg
Country: Luxembourg
IATA Code: LUX
ICAO Code: ELLX
Coordinates: 49°37′35″N, 6°12′41″E
Destination: Charlotte Amalie Harbor Seaplane Base
City: Charlotte Amalie
Country: U.S. Virgin Islands
IATA Code: SPB
ICAO Code: VI22
Coordinates: 18°20′18″N, 64°56′26″W
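The distance formulas above take decimal degrees. A small helper (our own, not part of the site) converts the degrees/minutes/seconds coordinates listed here:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(49, 37, 35, "N"))  # ≈ 49.6264 (LUX latitude)
print(dms_to_decimal(64, 56, 26, "W"))  # ≈ -64.9406 (SPB longitude)
```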