
How far is Spring Point from Lubbock, TX?

The distance between Lubbock (Lubbock Preston Smith International Airport) and Spring Point (Spring Point Airport) is 1861 miles / 2994 kilometers / 1617 nautical miles.

Lubbock Preston Smith International Airport – Spring Point Airport

1861 miles / 2994 kilometers / 1617 nautical miles


Distance from Lubbock to Spring Point

There are several ways to calculate the distance from Lubbock to Spring Point. Here are two standard methods:

Vincenty's formula (applied above)
  • 1860.669 miles
  • 2994.456 kilometers
  • 1616.877 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
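
For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The site's exact implementation and ellipsoid parameters are not published, so treat this as illustrative rather than the calculator's own code:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two lat/lon points (Vincenty inverse)."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# LBB and AXP coordinates from the airport information below, in decimal degrees:
print(vincenty_distance(33.663333, -101.822778, 22.441667, -73.970833) / 1000)
# -> roughly 2994 km, in line with the 2994.456 km quoted above
```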

Haversine formula
  • 1858.890 miles
  • 2991.594 kilometers
  • 1615.332 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
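
A matching sketch of the haversine formula. Note that the result depends on the Earth radius assumed; 6371 km is a common mean value, and the page does not state which radius it uses, so the output may differ from the quoted figure by a small amount:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

print(haversine_distance(33.663333, -101.822778, 22.441667, -73.970833))
# -> about 2992 km, within roughly a kilometre of the 2991.594 km quoted above
```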

How long does it take to fly from Lubbock to Spring Point?

The estimated flight time from Lubbock Preston Smith International Airport to Spring Point Airport is 4 hours and 1 minute.
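
Pages like this one typically derive block time from distance using an assumed average speed plus a fixed allowance for takeoff and landing. The parameters behind the 4 h 1 min figure are not published, so the values below are hypothetical:

```python
def flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Block-time rule of thumb: cruise time plus a fixed taxi/climb/descent allowance."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return hours, minutes

print(flight_time(1861))
# -> (4, 13) with these assumed values; the 4 h 1 min shown above implies
#    slightly different internal assumptions
```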

Flight carbon footprint between Lubbock Preston Smith International Airport (LBB) and Spring Point Airport (AXP)

On average, flying from Lubbock to Spring Point generates about 205 kg of CO2 per passenger; 205 kilograms is equal to 452 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
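
A quick check of the conversion, plus the per-mile emission factor implied by the page's numbers (useful for comparing routes; the site's actual emissions model is not disclosed):

```python
KG_PER_LB = 0.45359237            # exact definition of the pound in kilograms

co2_kg = 205.0                    # per-passenger estimate quoted above
print(round(co2_kg / KG_PER_LB))  # 452 lb, matching the figure above
print(round(co2_kg / 1861, 3))    # ~0.110 kg CO2 per passenger-mile (implied factor)
```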

Map of flight path from Lubbock to Spring Point

See the map of the shortest flight path between Lubbock Preston Smith International Airport (LBB) and Spring Point Airport (AXP).

Airport information

Origin: Lubbock Preston Smith International Airport
City: Lubbock, TX
Country: United States
IATA Code: LBB
ICAO Code: KLBB
Coordinates: 33°39′48″N, 101°49′22″W
Destination: Spring Point Airport
City: Spring Point
Country: Bahamas
IATA Code: AXP
ICAO Code: MYAP
Coordinates: 22°26′30″N, 73°58′15″W
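
The coordinates above are given in degrees, minutes, and seconds; to use them with the distance formulas earlier on this page they must be converted to signed decimal degrees. A small sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60 + seconds / 3600)

# The airport coordinates above, in the form the distance functions expect:
lbb = (dms_to_decimal(33, 39, 48, "N"), dms_to_decimal(101, 49, 22, "W"))
axp = (dms_to_decimal(22, 26, 30, "N"), dms_to_decimal(73, 58, 15, "W"))
print(lbb)  # (33.663333..., -101.822777...)
print(axp)  # (22.441666..., -73.970833...)
```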