How far is Gillam from Lubbock, TX?
The flight distance between Lubbock (Lubbock Preston Smith International Airport) and Gillam (Gillam Airport) is 1603 miles / 2580 kilometers / 1393 nautical miles.
The driving distance from Lubbock (LBB) to Gillam (YGX) is 2023 miles / 3255 kilometers, and travel time by car is about 41 hours.
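As a quick sanity check on those figures, the driving distance and time imply an average speed of roughly 49 mph. A minimal Python sketch, using standard unit-conversion constants:

```python
# Sanity check on the driving figures above: unit conversion and the
# implied average speed. The constants are standard definitions.
MILES_PER_KM = 0.621371

driving_miles = 2023
driving_hours = 41.0

print(f"{driving_miles / MILES_PER_KM:.0f} km")                  # ≈ 3256 km
print(f"{driving_miles / driving_hours:.0f} mph average speed")  # ≈ 49 mph
```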
Lubbock Preston Smith International Airport – Gillam Airport
Distance from Lubbock to Gillam
There are several ways to calculate the distance from Lubbock to Gillam. Here are two standard methods:
Vincenty's formula (applied above)
- 1603.343 miles
- 2580.331 kilometers
- 1393.267 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
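As an illustration, a comparable ellipsoidal distance can be computed with the third-party geopy package (an assumption; `pip install geopy`). Its `geodesic()` uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's formula, but the two agree to sub-millimeter precision. Coordinates are taken from the airport tables below.

```python
# Ellipsoidal (Vincenty-style) distance via geopy's geodesic(), which
# applies Karney's algorithm on the WGS-84 ellipsoid.
from geopy.distance import geodesic

lbb = (33.663333, -101.822778)  # 33°39′48″N, 101°49′22″W
ygx = (56.357222, -94.710556)   # 56°21′26″N, 94°42′38″W

d = geodesic(lbb, ygx)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} nm")
# ≈ 1603.3 mi / 2580.3 km / 1393.3 nm, matching the figures above
```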
Haversine formula
- 1604.015 miles
- 2581.412 kilometers
- 1393.851 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of a sphere).
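The haversine formula is short enough to implement directly. A minimal sketch; the result depends on the Earth radius chosen (the mean radius of 6371 km here), so the last decimals differ slightly from the figures above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (haversine) on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# LBB and YGX coordinates from the airport tables below
d_km = haversine_km(33.663333, -101.822778, 56.357222, -94.710556)
print(f"{d_km:.1f} km / {d_km * 0.621371:.1f} mi")  # ≈ 2581.7 km / 1604.2 mi
```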
How long does it take to fly from Lubbock to Gillam?
The estimated flight time from Lubbock Preston Smith International Airport to Gillam Airport is 3 hours and 32 minutes.
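Flight-time estimates like this are typically the great-circle distance divided by an assumed average cruise speed, plus a fixed allowance for taxi, climb, and descent. The parameters behind the 3 hours 32 minutes figure aren't stated, so the sketch below uses assumed values and lands in the same ballpark rather than matching exactly:

```python
# Hedged sketch of a typical flight-time estimate. The cruise speed and
# fixed overhead are assumptions, not the values behind the figure above.
distance_miles = 1603.3
cruise_mph = 500       # assumed average cruise speed
overhead_min = 30      # assumed taxi/climb/descent allowance

total_min = distance_miles / cruise_mph * 60 + overhead_min
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # ≈ 3 h 42 min
```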
What is the time difference between Lubbock and Gillam?
There is no time difference between Lubbock and Gillam: both observe Central Time (Texas and Manitoba are in the same time zone and both observe daylight saving time).
Flight carbon footprint between Lubbock Preston Smith International Airport (LBB) and Gillam Airport (YGX)
On average, flying from Lubbock to Gillam generates about 186 kg (410 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
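The pound figure is a straight unit conversion (1 kg ≈ 2.20462 lb):

```python
co2_kg = 186
print(f"{co2_kg * 2.20462:.0f} lb")  # 186 kg ≈ 410 lb
```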
Map of flight path and driving directions from Lubbock to Gillam
See the map of the shortest flight path between Lubbock Preston Smith International Airport (LBB) and Gillam Airport (YGX).
Airport information
| Origin | Lubbock Preston Smith International Airport |
| --- | --- |
| City | Lubbock, TX |
| Country | United States |
| IATA Code | LBB |
| ICAO Code | KLBB |
| Coordinates | 33°39′48″N, 101°49′22″W |
| Destination | Gillam Airport |
| --- | --- |
| City | Gillam |
| Country | Canada |
| IATA Code | YGX |
| ICAO Code | CYGX |
| Coordinates | 56°21′26″N, 94°42′38″W |