
How far is Charlotte Amalie from Stella Maris?

The distance between Stella Maris (Stella Maris Airport) and Charlotte Amalie (Charlotte Amalie Harbor Seaplane Base) is 758 miles / 1220 kilometers / 659 nautical miles.

Stella Maris Airport – Charlotte Amalie Harbor Seaplane Base

  • 758 miles
  • 1220 kilometers
  • 659 nautical miles


Distance from Stella Maris to Charlotte Amalie

There are several ways to calculate the distance from Stella Maris to Charlotte Amalie. Here are two standard methods:

Vincenty's formula (applied above)
  • 758.264 miles
  • 1220.308 kilometers
  • 658.914 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
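For readers who want to reproduce the figure, here is a sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values listed in the airport information section; the convergence tolerance and iteration cap are implementation choices, not part of the formula itself:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Geodesic distance on the WGS-84 ellipsoid via Vincenty's inverse formula."""
    a = 6378137.0              # semi-major axis in metres
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                 * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000    # metres -> kilometres

# SML and SPB coordinates converted to decimal degrees
km = vincenty_km(23.582222, -75.268611, 18.338333, -64.940556)
print(f"{km:.3f} km / {km / 1.609344:.3f} mi")  # ≈ 1220.3 km / 758.3 mi
```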

Haversine formula
  • 758.100 miles
  • 1220.044 kilometers
  • 658.771 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
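The great-circle figure above can be reproduced with a short haversine implementation. This is a minimal sketch assuming a mean Earth radius of 6371 km; the decimal coordinates are converted from the DMS values in the airport information section:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# SML (23°34′56″N, 75°16′7″W) and SPB (18°20′18″N, 64°56′26″W) in decimal degrees
km = haversine_km(23.582222, -75.268611, 18.338333, -64.940556)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi")  # ≈ 1220 km / 758 mi
```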

How long does it take to fly from Stella Maris to Charlotte Amalie?

The estimated flight time from Stella Maris Airport to Charlotte Amalie Harbor Seaplane Base is 1 hour and 56 minutes.
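The calculator does not publish its exact method, but flight-time estimates of this kind are typically the distance divided by an assumed average cruise speed, plus a fixed allowance for taxi, takeoff, and landing. A minimal sketch of that approach — the 500 mph cruise speed and 30-minute allowance are assumptions for illustration, not the site's actual parameters, so the result differs slightly from the figure above:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airliner flight time: cruise segment plus fixed taxi/climb/descent overhead."""
    return round(overhead_min + distance_miles / cruise_mph * 60)

minutes = estimated_flight_minutes(758)
print(f"{minutes // 60} h {minutes % 60} min")  # ≈ 2 h 1 min with these assumptions
```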

Flight carbon footprint between Stella Maris Airport (SML) and Charlotte Amalie Harbor Seaplane Base (SPB)

On average, flying from Stella Maris to Charlotte Amalie generates about 131 kg of CO2 per passenger, which is roughly 288 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
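The unit conversion in the paragraph above is straightforward (1 kg ≈ 2.20462 lb; truncating to whole pounds reproduces the quoted figure):

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 131
co2_lb = int(co2_kg * KG_TO_LB)  # truncate to whole pounds
print(co2_lb)  # 288
```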

Map of flight path from Stella Maris to Charlotte Amalie

See the map of the shortest flight path between Stella Maris Airport (SML) and Charlotte Amalie Harbor Seaplane Base (SPB).

Airport information

Origin: Stella Maris Airport
City: Stella Maris
Country: Bahamas
IATA Code: SML
ICAO Code: MYLS
Coordinates: 23°34′56″N, 75°16′7″W
Destination: Charlotte Amalie Harbor Seaplane Base
City: Charlotte Amalie
Country: U.S. Virgin Islands
IATA Code: SPB
ICAO Code: VI22
Coordinates: 18°20′18″N, 64°56′26″W
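The DMS coordinates above convert to decimal degrees as degrees + minutes/60 + seconds/3600, negated for southern latitudes and western longitudes. A small helper for feeding these values into either distance formula:

```python
def dms_to_decimal(degrees, minutes, seconds, negative=False):
    """Convert degrees/minutes/seconds to decimal degrees; negative for S or W."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if negative else value

# Stella Maris Airport: 23°34′56″N, 75°16′7″W
print(dms_to_decimal(23, 34, 56), dms_to_decimal(75, 16, 7, negative=True))
# Charlotte Amalie Harbor Seaplane Base: 18°20′18″N, 64°56′26″W
print(dms_to_decimal(18, 20, 18), dms_to_decimal(64, 56, 26, negative=True))
```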