
How far is Lubbock, TX, from Prince George?

The distance between Prince George (Prince George Airport) and Lubbock (Lubbock Preston Smith International Airport) is 1729 miles / 2783 kilometers / 1503 nautical miles.

The driving distance from Prince George (YXS) to Lubbock (LBB) is 2111 miles / 3398 kilometers, and travel time by car is about 40 hours 45 minutes.

Prince George Airport – Lubbock Preston Smith International Airport

1729 miles
2783 kilometers
1503 nautical miles


Distance from Prince George to Lubbock

There are several ways to calculate the distance from Prince George to Lubbock. Here are two standard methods:

Vincenty's formula (applied above)
  • 1729.099 miles
  • 2782.715 kilometers
  • 1502.546 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
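The exact implementation used for the figures above is not shown; a minimal Python sketch of Vincenty's iterative inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed below converted to decimal degrees, looks like this:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty inverse formula)."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2) -
        B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma)    # distance in meters

# YXS and LBB in decimal degrees (from the airport information below)
meters = vincenty_distance(53.8892, -122.6789, 33.6633, -101.8228)
print(meters / 1609.344, "miles")           # roughly 1729 miles
```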

Haversine formula
  • 1728.365 miles
  • 2781.533 kilometers
  • 1501.908 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of a sphere).
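For comparison, a short haversine sketch in Python; the mean Earth radius of 6371 km is an assumption, and a slightly different radius gives slightly different figures:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_distance(53.8892, -122.6789, 33.6633, -101.8228)
print(km, "km /", km / 1.609344, "miles")   # roughly 2782 km / 1728 miles
```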

How long does it take to fly from Prince George to Lubbock?

The estimated flight time from Prince George Airport to Lubbock Preston Smith International Airport is 3 hours and 46 minutes.
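The page does not state how this estimate is derived. As a rough check, the distance and time above imply an average speed of about 459 mph:

```python
distance_miles = 1729
flight_time_hours = 3 + 46 / 60              # 3 hours 46 minutes, from the estimate above
print(distance_miles / flight_time_hours)    # ≈ 459 mph implied average speed
```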

Flight carbon footprint between Prince George Airport (YXS) and Lubbock Preston Smith International Airport (LBB)

On average, flying from Prince George to Lubbock generates about 195 kg of CO2 per passenger; 195 kilograms is equal to 429 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
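The pound figure follows from the standard conversion factor of roughly 2.20462 lb per kilogram:

```python
co2_kg = 195
print(co2_kg * 2.20462)   # ≈ 429.9 lb, matching the figure above
```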

Map of flight path and driving directions from Prince George to Lubbock

See the map of the shortest flight path between Prince George Airport (YXS) and Lubbock Preston Smith International Airport (LBB).

Airport information

Origin: Prince George Airport
City: Prince George
Country: Canada
IATA Code: YXS
ICAO Code: CYXS
Coordinates: 53°53′21″N, 122°40′44″W
Destination: Lubbock Preston Smith International Airport
City: Lubbock, TX
Country: United States
IATA Code: LBB
ICAO Code: KLBB
Coordinates: 33°39′48″N, 101°49′22″W
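The coordinates above are given in degrees, minutes, and seconds. A small helper (the name dms_to_decimal is hypothetical) converts them to the decimal degrees used in the distance sketches earlier:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Coordinates listed above
print(dms_to_decimal(53, 53, 21, "N"), dms_to_decimal(122, 40, 44, "W"))   # YXS ≈ 53.8892, -122.6789
print(dms_to_decimal(33, 39, 48, "N"), dms_to_decimal(101, 49, 22, "W"))   # LBB ≈ 33.6633, -101.8228
```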