How far is Lubbock, TX, from Banjul?

The distance between Banjul (Banjul International Airport) and Lubbock (Lubbock Preston Smith International Airport) is 5444 miles / 8761 kilometers / 4731 nautical miles.


Distance from Banjul to Lubbock

There are several ways to calculate the distance from Banjul to Lubbock. Here are two standard methods:

Vincenty's formula (applied above)
  • 5444.111 miles
  • 8761.447 kilometers
  • 4730.803 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
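Concretely, the inverse problem can be solved with the classic Vincenty iteration. The following Python sketch uses the standard WGS-84 constants; the convergence tolerance and iteration cap are our own choices, not taken from the page, and the printed figure is a check against the distance quoted above.

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Ellipsoidal distance in metres between two points (Vincenty, WGS-84)."""
        a = 6378137.0            # semi-major axis (m)
        f = 1 / 298.257223563    # flattening
        b = (1 - f) * a          # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sig_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                          if cos2_alpha else 0.0)  # equatorial line
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sig_m + C * cos_sigma * (-1 + 2 * cos_2sig_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sig_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sig_m ** 2)
            - B / 6 * cos_2sig_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sig_m ** 2)))
        return b * A * (sigma - d_sigma)

    # BJL (13°20′16″N, 16°39′7″W) to LBB (33°39′48″N, 101°49′22″W)
    metres = vincenty_inverse(13.337778, -16.651944, 33.663333, -101.822778)
    print(round(metres / 1000, 3))  # ≈ 8761 km, matching the figure quoted above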

Haversine formula
  • 5437.208 miles
  • 8750.337 kilometers
  • 4724.804 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
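For comparison, here is a minimal haversine sketch in Python, assuming the commonly used 6371 km mean Earth radius (the page does not state which radius it uses):

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given radius."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    # BJL to LBB, decimal-degree coordinates from the airport table below
    print(haversine_km(13.337778, -16.651944, 33.663333, -101.822778))  # ≈ 8750 km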

How long does it take to fly from Banjul to Lubbock?

The estimated flight time from Banjul International Airport to Lubbock Preston Smith International Airport is 10 hours and 48 minutes.
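The page does not publish its timing formula, but the figure is consistent with a simple distance-over-average-speed estimate. The ~504 mph block speed below is our assumption, chosen only to illustrate the arithmetic:

    miles = 5444.111       # Vincenty distance from above
    avg_speed_mph = 504    # assumed average block speed (not from the source)
    hours = miles / avg_speed_mph
    print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # 10 h 48 min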

Flight carbon footprint between Banjul International Airport (BJL) and Lubbock Preston Smith International Airport (LBB)

On average, flying from Banjul to Lubbock generates about 642 kg of CO2 per passenger; 642 kilograms is equal to 1,416 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
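As a quick unit check (the kg-to-lb factor is standard; the page most likely converts from an unrounded kilogram figure, which is why it lands on 1,416 rather than 1,415):

    KG_TO_LB = 2.20462            # pounds per kilogram
    print(round(642 * KG_TO_LB))  # 1415 when converting the rounded 642 kg figure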

Map of flight path from Banjul to Lubbock

See the map of the shortest flight path between Banjul International Airport (BJL) and Lubbock Preston Smith International Airport (LBB).

Airport information

Origin: Banjul International Airport
City: Banjul
Country: Gambia
IATA Code: BJL
ICAO Code: GBYD
Coordinates: 13°20′16″N, 16°39′7″W

Destination: Lubbock Preston Smith International Airport
City: Lubbock, TX
Country: United States
IATA Code: LBB
ICAO Code: KLBB
Coordinates: 33°39′48″N, 101°49′22″W