
How far is Christiansted from Boston, MA?

The distance between Boston (Logan International Airport) and Christiansted (Christiansted Harbor Seaplane Base) is 1736 miles / 2794 kilometers / 1509 nautical miles.

Logan International Airport – Christiansted Harbor Seaplane Base
  • 1736 miles
  • 2794 kilometers
  • 1509 nautical miles
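
The three figures are the same distance expressed in different units (1 mile = 1.609344 km, 1 nautical mile = 1.852 km). A quick Python sanity check of the conversions, starting from the Vincenty value quoted further down:

```python
# Unit conversions for the distance figures above.
MILES_TO_KM = 1.609344
KM_PER_NAUTICAL_MILE = 1.852

distance_miles = 1736.009            # Vincenty distance quoted below
distance_km = distance_miles * MILES_TO_KM
distance_nm = distance_km / KM_PER_NAUTICAL_MILE

print(f"{distance_miles:.0f} miles")          # 1736 miles
print(f"{distance_km:.0f} kilometers")        # 2794 kilometers
print(f"{distance_nm:.0f} nautical miles")    # 1509 nautical miles
```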


Distance from Boston to Christiansted

There are several ways to calculate the distance from Boston to Christiansted. Here are two standard methods:

Vincenty's formula (applied above)
  • 1736.009 miles
  • 2793.836 kilometers
  • 1508.551 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
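
As an illustration, here is a minimal Python sketch of the ellipsoidal calculation using the geopy library. geopy's geodesic() solves the same inverse problem on the WGS-84 ellipsoid (using Karney's algorithm rather than Vincenty's iteration, though the two agree to well under a metre at this range), so it should reproduce the figures above from the airport coordinates listed in the airport information section:

```python
# Ellipsoidal (WGS-84) distance between BOS and SSB using geopy.
from geopy.distance import geodesic

# Coordinates from the airport information section (decimal degrees).
bos = (42.3642, -71.0050)   # Logan International Airport (BOS)
ssb = (17.7469, -64.7047)   # Christiansted Harbor Seaplane Base (SSB)

d = geodesic(bos, ssb)
print(f"{d.miles:.3f} miles")            # ≈ 1736 miles
print(f"{d.kilometers:.3f} kilometers")  # ≈ 2794 kilometers
print(f"{d.nautical:.3f} nautical miles")# ≈ 1509 nautical miles
```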

Haversine formula
  • 1740.832 miles
  • 2801.598 kilometers
  • 1512.742 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
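
For comparison, a self-contained haversine sketch assuming a mean Earth radius of 6371 km; with the same coordinates it comes out close to the 2801.6 km figure above, and the spherical model is why it differs slightly from the ellipsoidal (Vincenty) value:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(42.3642, -71.0050, 17.7469, -64.7047)
print(f"{km:.1f} km, {km / 1.609344:.1f} miles, {km / 1.852:.1f} NM")
```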

How long does it take to fly from Boston to Christiansted?

The estimated flight time from Logan International Airport to Christiansted Harbor Seaplane Base is 3 hours and 47 minutes.
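
The calculator does not state how this estimate is derived. One set of assumptions that reproduces the 3 hours 47 minutes figure is an average cruise speed of about 850 km/h plus roughly 30 minutes for taxi, climb and descent:

```python
# Back-of-the-envelope flight-time estimate. The cruise speed and
# fixed overhead below are assumptions, not the calculator's documented
# method, but they reproduce the quoted figure for 2794 km.
CRUISE_KMH = 850        # assumed average cruise speed
OVERHEAD_MIN = 30       # assumed taxi/climb/descent allowance

def estimated_flight_time(distance_km):
    total_min = distance_km / CRUISE_KMH * 60 + OVERHEAD_MIN
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(2794))   # -> "3 h 47 min"
```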

Flight carbon footprint between Logan International Airport (BOS) and Christiansted Harbor Seaplane Base (SSB)

On average, flying from Boston to Christiansted generates about 195 kg of CO2 per passenger (195 kilograms is roughly 430 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
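
The page does not publish its emissions methodology; the sketch below simply uses the per-passenger-kilometre factor implied by the figures above (about 0.07 kg of CO2 per km) together with the standard kilogram-to-pound conversion:

```python
# Rough footprint arithmetic. The emission factor is the value implied
# by the 195 kg / 2794 km figures above, not an official methodology.
KG_CO2_PER_PAX_KM = 195 / 2794    # ≈ 0.0698 kg CO2 per passenger-km
KG_TO_LB = 2.20462

co2_kg = 2794 * KG_CO2_PER_PAX_KM
print(f"{co2_kg:.0f} kg CO2 per passenger")   # 195 kg
print(f"{co2_kg * KG_TO_LB:.0f} lbs")         # ≈ 430 lbs
```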

Map of flight path from Boston to Christiansted

See the map of the shortest flight path between Logan International Airport (BOS) and Christiansted Harbor Seaplane Base (SSB).

Airport information

Origin: Logan International Airport
City: Boston, MA
Country: United States
IATA Code: BOS
ICAO Code: KBOS
Coordinates: 42°21′51″N, 71°0′18″W
Destination: Christiansted Harbor Seaplane Base
City: Christiansted
Country: U.S. Virgin Islands
IATA Code: SSB
ICAO Code: VI32
Coordinates: 17°44′49″N, 64°42′17″W