
How far is Spring Point from Las Vegas, NV?

The distance between Las Vegas (Las Vegas Harry Reid International Airport) and Spring Point (Spring Point Airport) is 2635 miles / 4241 kilometers / 2290 nautical miles.

Las Vegas Harry Reid International Airport – Spring Point Airport

  • 2635 miles
  • 4241 kilometers
  • 2290 nautical miles
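The three figures above are the same distance in different units. As a quick check, the conversions follow directly from the definitions of the statute mile (1.609344 km) and the international nautical mile (1.852 km):

```python
# Convert the kilometer figure into statute and nautical miles.
MI_PER_KM = 1 / 1.609344   # statute mile: exactly 1.609344 km
NMI_PER_KM = 1 / 1.852     # international nautical mile: exactly 1.852 km

km = 4241.001  # Vincenty distance from the section below
print(f"{km * MI_PER_KM:.3f} miles")          # ≈ 2635.236 miles
print(f"{km * NMI_PER_KM:.3f} nautical miles") # ≈ 2289.958 nautical miles
```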


Distance from Las Vegas to Spring Point

There are several ways to calculate the distance from Las Vegas to Spring Point. Here are two standard methods:

Vincenty's formula (applied above)
  • 2635.236 miles
  • 4241.001 kilometers
  • 2289.957 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
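Vincenty's inverse method is iterative. The sketch below is a minimal Python implementation on the WGS-84 ellipsoid; the ellipsoid choice and the airport coordinates (taken from the airport information section below) are assumptions, since the calculator does not state its exact parameters:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in km via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate longitude on the auxiliary sphere
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0  # km

print(vincenty_km(36.08, -115.151944, 22.441667, -73.970833))  # ≈ 4241 km
```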

Haversine formula
  • 2631.896 miles
  • 4235.627 kilometers
  • 2287.055 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
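The haversine calculation is much simpler than Vincenty's. A minimal sketch, assuming a mean earth radius of 6371 km (the calculator does not state which radius it uses) and the airport coordinates listed below:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km, assuming a spherical earth of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# LAS and AXP coordinates from the airport information section below
print(haversine_km(36.08, -115.151944, 22.441667, -73.970833))  # ≈ 4236 km
```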

How long does it take to fly from Las Vegas to Spring Point?

The estimated flight time from Las Vegas Harry Reid International Airport to Spring Point Airport is 5 hours and 29 minutes.
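The calculator does not publish its timing formula. A common estimate assumes a fixed average cruise speed plus a flat allowance for taxi, climb, and descent; with an assumed 850 km/h cruise and a 30-minute allowance (both parameters are guesses, not the site's stated method), the figure above is reproduced:

```python
def estimate_flight_time(distance_km, cruise_kmh=850.0, overhead_h=0.5):
    """Rough flight time: distance at cruise speed plus a fixed overhead.

    cruise_kmh and overhead_h are assumed parameters, not the
    calculator's published values.
    """
    total_h = distance_km / cruise_kmh + overhead_h
    hours = int(total_h)
    minutes = round((total_h - hours) * 60)
    return hours, minutes

print(estimate_flight_time(4241))  # → (5, 29)
```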

Flight carbon footprint between Las Vegas Harry Reid International Airport (LAS) and Spring Point Airport (AXP)

On average, flying from Las Vegas to Spring Point generates about 291 kg of CO2 per passenger, which is equivalent to 642 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
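The kilogram-to-pound conversion above follows from the standard factor of about 2.20462 lb per kg. A commonly used emission factor (an assumption here, not the site's stated method) is roughly 3.16 kg of CO2 per kg of jet fuel burned, which would put the per-passenger fuel burn near 92 kg:

```python
KG_TO_LB = 2.20462          # pounds per kilogram
CO2_PER_KG_FUEL = 3.16      # kg CO2 per kg of jet fuel (assumed factor)

co2_kg = 291                # per-passenger estimate from the text
print(round(co2_kg * KG_TO_LB))        # → 642 lbs
print(round(co2_kg / CO2_PER_KG_FUEL)) # ≈ 92 kg of fuel per passenger
```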

Map of flight path from Las Vegas to Spring Point

See the map of the shortest flight path between Las Vegas Harry Reid International Airport (LAS) and Spring Point Airport (AXP).

Airport information

Origin: Las Vegas Harry Reid International Airport
City: Las Vegas, NV
Country: United States
IATA Code: LAS
ICAO Code: KLAS
Coordinates: 36°4′48″N, 115°9′7″W
Destination: Spring Point Airport
City: Spring Point
Country: Bahamas
IATA Code: AXP
ICAO Code: MYAP
Coordinates: 22°26′30″N, 73°58′15″W