How far is Lopez, WA, from Silao?

The distance between Silao (Bajío International Airport) and Lopez (Lopez Island Airport) is 2237 miles / 3601 kilometers / 1944 nautical miles.

The driving distance from Silao (BJX) to Lopez (LPS) is 2697 miles / 4341 kilometers, and travel time by car is about 52 hours 29 minutes.

Bajío International Airport – Lopez Island Airport

  • 2237 miles
  • 3601 kilometers
  • 1944 nautical miles

Distance from Silao to Lopez

There are several ways to calculate the distance from Silao to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2237.317 miles
  • 3600.613 kilometers
  • 1944.176 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
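As a worked illustration, here is a short Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are converted from the DMS values in the airport table below; the convergence tolerance and iteration cap are arbitrary choices, and this is not necessarily the exact implementation behind the figure above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in miles.
    Coincident and near-antipodal points are not handled in this sketch."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = a * (1 - f)          # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate the auxiliary longitude to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344  # meters to international miles

# BJX (20°59′36″N, 101°28′51″W) to LPS (48°29′2″N, 122°56′16″W)
print(round(vincenty_miles(20.9933, -101.4808, 48.4839, -122.9378), 1))  # ≈ 2237 mi
```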

Haversine formula
  • 2239.584 miles
  • 3604.261 kilometers
  • 1946.145 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth, yielding the great-circle distance: the shortest path between two points along the surface of the sphere.
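A compact Python sketch of the haversine computation, assuming a mean Earth radius of 6371 km (the radius this calculator uses is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

# Same BJX → LPS coordinates as above
print(round(haversine_km(20.9933, -101.4808, 48.4839, -122.9378), 1))  # ≈ 3604 km
```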

How long does it take to fly from Silao to Lopez?

The estimated flight time from Bajío International Airport to Lopez Island Airport is 4 hours and 44 minutes.
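The page does not state its assumptions, but estimates like this typically combine the great-circle distance at an assumed cruise speed with a fixed allowance for takeoff and landing. Both parameters below are illustrative assumptions, not the calculator's actual inputs, so the result only lands in the same ballpark as the 4 hours 44 minutes quoted above.

```python
def flight_time_hours(distance_miles, cruise_mph=500, overhead_hours=0.5):
    # Rule of thumb: cruise time plus a fixed takeoff/landing allowance.
    return distance_miles / cruise_mph + overhead_hours

hours = flight_time_hours(2237.317)
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ≈ 4 h 58 min with these assumptions
```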

Flight carbon footprint between Bajío International Airport (BJX) and Lopez Island Airport (LPS)

On average, flying from Silao to Lopez generates about 245 kg (539 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
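For a rough sense of the arithmetic, the sketch below converts the page's figure to pounds and computes the per-mile intensity it implies; the slight mismatch with the quoted 539 lb suggests the page converts an unrounded kilogram figure.

```python
KG_PER_LB = 0.45359237

co2_kg = 245        # per-passenger estimate from the page
miles = 2237.317    # Vincenty distance from above

print(round(co2_kg / KG_PER_LB))       # ≈ 540 lb (the page quotes 539)
print(round(co2_kg / miles * 1000))    # ≈ 110 g CO2 per passenger-mile implied
```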

Map of flight path and driving directions from Silao to Lopez

See the map of the shortest flight path between Bajío International Airport (BJX) and Lopez Island Airport (LPS).

Airport information

Origin: Bajío International Airport
City: Silao
Country: Mexico
IATA Code: BJX
ICAO Code: MMLO
Coordinates: 20°59′36″N, 101°28′51″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W