
How far is Lethbridge from Silao?

The distance between Silao (Bajío International Airport) and Lethbridge (Lethbridge Airport) is 2070 miles / 3331 kilometers / 1799 nautical miles.

The driving distance from Silao (BJX) to Lethbridge (YQL) is 2521 miles / 4057 kilometers, and travel time by car is about 47 hours 35 minutes.

Bajío International Airport – Lethbridge Airport: 2070 miles / 3331 kilometers / 1799 nautical miles

Distance from Silao to Lethbridge

There are several ways to calculate the distance from Silao to Lethbridge. Here are two standard methods:

Vincenty's formula (applied above)
  • 2070.012 miles
  • 3331.361 kilometers
  • 1798.791 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
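
For illustration, here is a compact Python sketch of the standard Vincenty inverse algorithm on the WGS-84 ellipsoid. It is not necessarily this calculator's exact implementation, but with the airport coordinates listed below (converted to decimal degrees) it reproduces the figures above to within rounding.

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Distance in metres between two points on the WGS-84 ellipsoid."""
        a = 6378137.0            # semi-major axis (m)
        f = 1 / 298.257223563    # flattening
        b = (1 - f) * a          # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:   # converged
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # BJX and YQL in decimal degrees (from the airport information below)
    m = vincenty_inverse(20.9933, -101.4808, 49.6303, -112.8000)
    print(m / 1609.344, m / 1000, m / 1852)  # ≈ 2070 mi, 3331 km, 1799 NM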

Haversine formula
  • 2073.690 miles
  • 3337.281 kilometers
  • 1801.988 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
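
A minimal Python sketch of the haversine calculation, assuming the commonly used mean earth radius of 6371 km (a slightly different radius would shift the result slightly):

    import math

    def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
        """Great-circle distance in km on a sphere of mean radius r_km."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * r_km * math.asin(math.sqrt(a))

    d = haversine_km(20.9933, -101.4808, 49.6303, -112.8000)
    print(d, d / 1.609344, d / 1.852)  # ≈ 3337 km, 2074 mi, 1802 NM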

How long does it take to fly from Silao to Lethbridge?

The estimated flight time from Bajío International Airport to Lethbridge Airport is 4 hours and 25 minutes.
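
The page does not say how this estimate is derived. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to the cruise time over the great-circle distance; the sketch below uses illustrative constants (30 minutes of overhead, 500 mph average speed), so it lands near, but not exactly on, the figure quoted above.

    def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Rule-of-thumb duration: fixed taxi/climb/descent allowance plus
        cruise at an average ground speed. Both constants are illustrative
        assumptions, not the calculator's actual parameters."""
        total_min = round(overhead_min + distance_miles / cruise_mph * 60)
        hours, minutes = divmod(total_min, 60)
        return f"{hours} h {minutes} min"

    print(flight_time(2070))  # "4 h 38 min" with these assumed constants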

Flight carbon footprint between Bajío International Airport (BJX) and Lethbridge Airport (YQL)

On average, flying from Silao to Lethbridge generates about 225 kg of CO2 per passenger, which is equivalent to roughly 497 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
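
The underlying emission model is not published; back-calculating from the figures above implies roughly 0.109 kg of CO2 per passenger-mile on this route. A sketch using that implied factor, together with the kilogram-to-pound conversion:

    KG_PER_PAX_MILE = 225 / 2070  # ≈ 0.109, implied by the figures above
    KG_TO_LB = 2.20462

    kg = 2070 * KG_PER_PAX_MILE   # per-passenger CO2 from fuel burn only
    print(f"{kg:.0f} kg ≈ {kg * KG_TO_LB:.0f} lb")  # 225 kg ≈ 496 lb
    # (the quoted 497 lb suggests the unrounded kg figure is slightly higher)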

Map of flight path and driving directions from Silao to Lethbridge

See the map of the shortest flight path between Bajío International Airport (BJX) and Lethbridge Airport (YQL).

Airport information

Origin: Bajío International Airport
City: Silao
Country: Mexico
IATA Code: BJX
ICAO Code: MMLO
Coordinates: 20°59′36″N, 101°28′51″W
Destination: Lethbridge Airport
City: Lethbridge
Country: Canada
IATA Code: YQL
ICAO Code: CYQL
Coordinates: 49°37′49″N, 112°48′0″W
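
The distance formulas above take decimal degrees, while the coordinates here are listed in degrees, minutes, and seconds. A small conversion helper (hemisphere handling assumed as shown):

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Degrees/minutes/seconds to signed decimal degrees;
        southern and western hemispheres are negative."""
        value = deg + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    bjx = (dms_to_decimal(20, 59, 36, "N"), dms_to_decimal(101, 28, 51, "W"))
    yql = (dms_to_decimal(49, 37, 49, "N"), dms_to_decimal(112, 48, 0, "W"))
    print(bjx)  # (20.9933..., -101.4808...)
    print(yql)  # (49.6302..., -112.8)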