How far is Lubbock, TX, from Uranium City?

The distance between Uranium City (Uranium City Airport) and Lubbock (Lubbock Preston Smith International Airport) is 1815 miles / 2920 kilometers / 1577 nautical miles.

The driving distance from Uranium City (YBE) to Lubbock (LBB) is 2369 miles / 3813 kilometers, and travel time by car is about 56 hours 10 minutes.

Uranium City Airport – Lubbock Preston Smith International Airport

1815 miles / 2920 kilometers / 1577 nautical miles

Distance from Uranium City to Lubbock

There are several ways to calculate the distance from Uranium City to Lubbock. Here are two standard methods:

Vincenty's formula (applied above)
  • 1814.597 miles
  • 2920.311 kilometers
  • 1576.842 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
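For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The iteration cap, convergence tolerance, and the decimal airport coordinates (converted from the degrees/minutes/seconds values in the airport information section) are assumptions for illustration, not necessarily the calculator's exact implementation.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        # WGS-84 ellipsoid constants
        a = 6378137.0                     # semi-major axis in metres
        f = 1 / 298.257223563             # flattening
        b = (1 - f) * a                   # semi-minor axis in metres

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos_sq_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                       if cos_sq_alpha else 0.0)
            C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
        B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
        delta_sigma = B * sin_sigma * (
            cos_2sm + B / 4 * (
                cos_sigma * (-1 + 2 * cos_2sm ** 2)
                - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sm ** 2)))
        metres = b * A * (sigma - delta_sigma)
        return metres / 1609.344          # metres to statute miles

    # YBE and LBB in decimal degrees (from the DMS coordinates listed below)
    print(vincenty_miles(59.5614, -108.4808, 33.6633, -101.8228))  # ≈ 1814.6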

Haversine formula
  • 1814.972 miles
  • 2920.914 kilometers
  • 1577.168 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
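The haversine formula is short enough to show in full. This Python sketch assumes a mean Earth radius of about 3,958.76 miles, which is a common convention rather than a value taken from this page.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        R = 3958.76                   # mean Earth radius in statute miles (assumed)
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        d_phi = math.radians(lat2 - lat1)
        d_lam = math.radians(lon2 - lon1)
        h = (math.sin(d_phi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(h))

    # Same decimal coordinates as above for YBE and LBB
    print(haversine_miles(59.5614, -108.4808, 33.6633, -101.8228))  # ≈ 1815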

How long does it take to fly from Uranium City to Lubbock?

The estimated flight time from Uranium City Airport to Lubbock Preston Smith International Airport is 3 hours and 56 minutes.
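The paragraph above gives only the final estimate. A common rule of thumb is great-circle distance divided by a typical airliner cruise speed, plus a fixed allowance for take-off and landing; the 500 mph speed and 30-minute allowance in this sketch are assumptions for illustration and are not necessarily the parameters behind the 3 hours 56 minutes figure.

    def flight_time_hours(distance_miles, cruise_mph=500, overhead_minutes=30):
        # Rule-of-thumb estimate: cruise time plus a fixed take-off/landing allowance
        return distance_miles / cruise_mph + overhead_minutes / 60

    hours = flight_time_hours(1815)
    print(f"{int(hours)} h {round((hours % 1) * 60)} min")  # rough estimate only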

What is the time difference between Uranium City and Lubbock?

There is no time difference between Uranium City and Lubbock.

Flight carbon footprint between Uranium City Airport (YBE) and Lubbock Preston Smith International Airport (LBB)

On average, flying from Uranium City to Lubbock generates about 201 kg of CO2 per passenger; 201 kilograms is equal to about 443 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
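For reference, the kilogram-to-pound conversion behind that figure uses the standard factor of about 2.20462 lb per kg:

    co2_kg = 201                      # estimated CO2 per passenger for this flight
    print(round(co2_kg * 2.20462))    # ≈ 443 lbs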

Map of flight path and driving directions from Uranium City to Lubbock

See the map of the shortest flight path between Uranium City Airport (YBE) and Lubbock Preston Smith International Airport (LBB).

Airport information

Origin Uranium City Airport
City: Uranium City
Country: Canada
IATA Code: YBE
ICAO Code: CYBE
Coordinates: 59°33′41″N, 108°28′51″W
Destination Lubbock Preston Smith International Airport
City: Lubbock, TX
Country: United States
IATA Code: LBB
ICAO Code: KLBB
Coordinates: 33°39′48″N, 101°49′22″W
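The coordinates above are listed in degrees, minutes, and seconds, while the distance formulas earlier in the page work in decimal degrees. This small helper (a hypothetical name, not part of the page) performs the conversion:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # West and South hemispheres become negative decimal degrees
        sign = -1 if hemisphere in ("W", "S") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(59, 33, 41, "N"))    # ≈ 59.5614   (YBE latitude)
    print(dms_to_decimal(108, 28, 51, "W"))   # ≈ -108.4808 (YBE longitude)
    print(dms_to_decimal(33, 39, 48, "N"))    # ≈ 33.6633   (LBB latitude)
    print(dms_to_decimal(101, 49, 22, "W"))   # ≈ -101.8228 (LBB longitude)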