How far is Lubbock, TX, from Fort St. John?

The distance between Fort St. John (Fort St. John Airport) and Lubbock (Lubbock Preston Smith International Airport) is 1800 miles / 2898 kilometers / 1565 nautical miles.

The driving distance from Fort St. John (YXJ) to Lubbock (LBB) is 2197 miles / 3535 kilometers, and travel time by car is about 41 hours 10 minutes.

Fort St. John Airport – Lubbock Preston Smith International Airport

  • 1800 miles
  • 2898 kilometers
  • 1565 nautical miles

Distance from Fort St. John to Lubbock

There are several ways to calculate the distance from Fort St. John to Lubbock. Here are two standard methods:

Vincenty's formula (applied above)
  • 1800.453 miles
  • 2897.549 kilometers
  • 1564.551 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
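
For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The iteration cap and convergence tolerance are implementation choices, and the decimal coordinates are converted from the DMS values in the Airport information section below.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        """Iterative Vincenty inverse solution on the WGS-84 ellipsoid."""
        a = 6378137.0              # WGS-84 semi-major axis, metres
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):                      # iterate lambda to convergence
            sinLam, cosLam = math.sin(lam), math.cos(lam)
            sinSigma = math.hypot(cosU2 * sinLam,
                                  cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
            if sinSigma == 0:
                return 0.0                        # coincident points
            cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
            sigma = math.atan2(sinSigma, cosSigma)
            sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
            cos2Alpha = 1 - sinAlpha ** 2
            cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                          if cos2Alpha else 0.0)  # guard for equatorial lines
            C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
            lamPrev = lam
            lam = L + (1 - C) * f * sinAlpha * (
                sigma + C * sinSigma * (
                    cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
            if abs(lam - lamPrev) < 1e-12:
                break

        u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
            * (-3 + 4 * cos2SigmaM ** 2)))
        return b * A * (sigma - dSigma) / 1609.344   # metres -> statute miles

    # YXJ -> LBB, coordinates from the Airport information section
    print(round(vincenty_miles(56.238056, -120.739722,
                               33.663333, -101.822778), 3))  # ~1800.45 miles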

Haversine formula
  • 1799.997 miles
  • 2896.814 kilometers
  • 1564.155 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
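
The haversine calculation is short enough to sketch directly. Here is a minimal Python version using the same airport coordinates; the 3958.8-mile mean earth radius is a common convention, so the last decimal place may differ slightly from the figure above.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance on a spherical earth, in statute miles."""
        R = 3958.8  # mean earth radius in miles (assumed value)
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(a))

    # YXJ -> LBB, coordinates from the Airport information section
    print(round(haversine_miles(56.238056, -120.739722,
                                33.663333, -101.822778), 3))  # ~1800 miles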

How long does it take to fly from Fort St. John to Lubbock?

The estimated flight time from Fort St. John Airport to Lubbock Preston Smith International Airport is 3 hours and 54 minutes.
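
The page does not state how the estimate is derived; as a rough sanity check, a sketch assuming an average gate-to-gate speed of about 462 mph (an assumption, not a figure from this page) reproduces it:

    distance_miles = 1800.453           # Vincenty distance from above
    avg_block_speed_mph = 462           # assumed average speed, gate to gate
    hours = distance_miles / avg_block_speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    print(f"{h} h {m} min")             # -> 3 h 54 min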

Flight carbon footprint between Fort St. John Airport (YXJ) and Lubbock Preston Smith International Airport (LBB)

On average, flying from Fort St. John to Lubbock generates about 200 kg of CO2 per passenger, which is roughly 441 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
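
As a quick arithmetic check, the pound conversion and the implied per-mile rate follow directly from the figures above (2.20462 is the standard kg-to-lb factor):

    co2_kg = 200                          # per-passenger estimate from above
    print(round(co2_kg * 2.20462))        # kg -> lbs: 441
    print(round(co2_kg / 1800.453, 3))    # ~0.111 kg CO2 per passenger-mile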

Map of flight path and driving directions from Fort St. John to Lubbock

See the map of the shortest flight path between Fort St. John Airport (YXJ) and Lubbock Preston Smith International Airport (LBB).

Airport information

Origin: Fort St. John Airport
City: Fort St. John
Country: Canada
IATA Code: YXJ
ICAO Code: CYXJ
Coordinates: 56°14′17″N, 120°44′23″W
Destination: Lubbock Preston Smith International Airport
City: Lubbock, TX
Country: United States
IATA Code: LBB
ICAO Code: KLBB
Coordinates: 33°39′48″N, 101°49′22″W
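
The distance formulas sketched above take decimal degrees, so the DMS coordinates listed here must be converted first. A minimal helper:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    # Fort St. John Airport: 56°14′17″N, 120°44′23″W
    print(dms_to_decimal(56, 14, 17, "N"), dms_to_decimal(120, 44, 23, "W"))
    # Lubbock Preston Smith International Airport: 33°39′48″N, 101°49′22″W
    print(dms_to_decimal(33, 39, 48, "N"), dms_to_decimal(101, 49, 22, "W"))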