
How far is Charlotte Amalie from Birmingham?

The distance between Birmingham (Birmingham Airport) and Charlotte Amalie (Charlotte Amalie Harbor Seaplane Base) is 4101 miles / 6600 kilometers / 3564 nautical miles.

Birmingham Airport – Charlotte Amalie Harbor Seaplane Base

4101 miles / 6600 kilometers / 3564 nautical miles

Distance from Birmingham to Charlotte Amalie

There are several ways to calculate the distance from Birmingham to Charlotte Amalie. Here are two standard methods:

Vincenty's formula (applied above)
  • 4101.336 miles
  • 6600.460 kilometers
  • 3563.963 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
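
Below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, for readers who want to reproduce the figure. The function name and the decimal coordinates (converted from the airport information below) are ours, and the site's exact implementation isn't published, so the last decimals may differ slightly.

```python
import math

# WGS-84 ellipsoid parameters
A_AXIS = 6378137.0          # semi-major axis in metres
F = 1 / 298.257223563       # flattening
B_AXIS = (1 - F) * A_AXIS   # semi-minor axis in metres

def vincenty_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula: ellipsoidal distance in metres."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sig = math.hypot(cosU2 * sin_lam,
                             cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sig == 0:
            return 0.0  # coincident points
        cos_sig = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sig = math.atan2(sin_sig, cos_sig)
        sin_alp = cosU1 * cosU2 * sin_lam / sin_sig
        cos2_alp = 1 - sin_alp ** 2
        # guard against division by zero for purely equatorial lines
        cos_2sm = cos_sig - 2 * sinU1 * sinU2 / cos2_alp if cos2_alp else 0.0
        C = F / 16 * cos2_alp * (4 + F * (4 - 3 * cos2_alp))
        lam_old = lam
        lam = L + (1 - C) * F * sin_alp * (
            sig + C * sin_sig * (cos_2sm + C * cos_sig * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_old) < tol:
            break

    u2 = cos2_alp * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sig = B * sin_sig * (cos_2sm + B / 4 * (
        cos_sig * (2 * cos_2sm ** 2 - 1)
        - B / 6 * cos_2sm * (4 * sin_sig ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return B_AXIS * A * (sig - d_sig)

# BHX -> SPB, decimal degrees from the airport information below
metres = vincenty_m(52.45389, -1.74778, 18.33833, -64.94056)
print(f"{metres / 1609.344:.3f} miles, {metres / 1000:.3f} km")  # ≈ 4101 mi / 6600 km
```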

Haversine formula
  • 4098.190 miles
  • 6595.397 kilometers
  • 3561.230 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface).
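
As a cross-check, here is a compact Python sketch of the haversine formula. The mean Earth radius of 6,371 km is our assumption; the site doesn't state which radius it uses.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius (assumed; not stated by the site)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# BHX -> SPB, same decimal coordinates as above
print(f"{haversine_km(52.45389, -1.74778, 18.33833, -64.94056):.1f} km")  # ≈ 6595 km
```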

How long does it take to fly from Birmingham to Charlotte Amalie?

The estimated flight time from Birmingham Airport to Charlotte Amalie Harbor Seaplane Base is 8 hours and 15 minutes.
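
The site doesn't publish its timing formula. A common rule of thumb divides the great-circle distance by a typical airliner cruise speed of about 500 mph, which lands close to the quoted figure; a quick sketch under that assumption:

```python
# Rough estimate only: assumes ~500 mph average ground speed; the site's
# actual formula (speed, taxi and climb allowances) is not stated.
CRUISE_MPH = 500
distance_miles = 4101
hours = distance_miles / CRUISE_MPH
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ≈ 8 h 12 min vs the quoted 8 h 15 min
```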

Flight carbon footprint between Birmingham Airport (BHX) and Charlotte Amalie Harbor Seaplane Base (SPB)

On average, flying from Birmingham to Charlotte Amalie generates about 469 kg of CO2 per passenger; 469 kilograms equals 1,034 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
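
The kilogram-to-pound conversion can be checked directly; the 469 kg figure itself comes from the site's unpublished fuel-burn model, so only the conversion is verified here:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound
co2_kg = 469
print(f"{co2_kg / KG_PER_LB:,.0f} lbs")  # 1,034 lbs
```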

Map of flight path from Birmingham to Charlotte Amalie

See the map of the shortest flight path between Birmingham Airport (BHX) and Charlotte Amalie Harbor Seaplane Base (SPB).

Airport information

Origin: Birmingham Airport
City: Birmingham
Country: United Kingdom
IATA Code: BHX
ICAO Code: EGBB
Coordinates: 52°27′14″N, 1°44′52″W
Destination: Charlotte Amalie Harbor Seaplane Base
City: Charlotte Amalie
Country: U.S. Virgin Islands
IATA Code: SPB
FAA Code: VI22
Coordinates: 18°20′18″N, 64°56′26″W
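
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion helper (ours, not part of the site):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus an N/S/E/W hemisphere to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(52, 27, 14, "N"), dms_to_decimal(1, 44, 52, "W"))   # BHX ≈ 52.45389, -1.74778
print(dms_to_decimal(18, 20, 18, "N"), dms_to_decimal(64, 56, 26, "W"))  # SPB ≈ 18.33833, -64.94056
```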