
How far is Lopez, WA, from Puebla?

The distance between Puebla (Puebla International Airport) and Lopez (Lopez Island Airport) is 2443 miles / 3932 kilometers / 2123 nautical miles.

The driving distance from Puebla (PBC) to Lopez (LPS) is 2973 miles / 4784 kilometers, and travel time by car is about 57 hours 32 minutes.

Puebla International Airport – Lopez Island Airport

2443 miles / 3932 kilometers / 2123 nautical miles


Distance from Puebla to Lopez

There are several ways to calculate the distance from Puebla to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2443.229 miles
  • 3931.996 kilometers
  • 2123.108 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
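As a sketch, Vincenty's inverse method can be implemented as below on the WGS-84 ellipsoid (whether the site uses exactly these ellipsoid parameters is an assumption). It iterates on the longitude difference on an auxiliary sphere until convergence:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid (distance in km)."""
    a = 6378137.0             # semi-major axis in metres (WGS-84)
    f = 1 / 298.257223563     # flattening (WGS-84)
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):      # iterate lambda until it stops changing
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0        # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM
                                    + C * cosSigma * (2 * cos2SigmaM ** 2 - 1)))
        if abs(lam - lamPrev) < 1e-12:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (2 * cos2SigmaM ** 2 - 1)
        - B / 6 * cos2SigmaM * (4 * sinSigma ** 2 - 3)
        * (4 * cos2SigmaM ** 2 - 3)))
    return b * A * (sigma - deltaSigma) / 1000.0

# PBC 19°9'29"N 98°22'17"W  ->  LPS 48°29'2"N 122°56'16"W
km = vincenty_km(19 + 9/60 + 29/3600, -(98 + 22/60 + 17/3600),
                 48 + 29/60 + 2/3600, -(122 + 56/60 + 16/3600))
print(f"{km:.3f} km")  # roughly 3932 km, in line with the figure above
```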

Haversine formula
  • 2445.608 miles
  • 3935.825 kilometers
  • 2125.176 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
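Using the airport coordinates listed further down and a mean Earth radius of 6371 km (an assumption; the radius the site uses is not stated), the haversine calculation can be sketched as:

```python
import math

# Airport coordinates from this page (decimal degrees; west longitudes negative).
PBC = (19 + 9/60 + 29/3600, -(98 + 22/60 + 17/3600))   # Puebla (PBC)
LPS = (48 + 29/60 + 2/3600, -(122 + 56/60 + 16/3600))  # Lopez Island (LPS)

def haversine_km(p1, p2, radius_km=6371.0):
    """Great-circle distance on a sphere via the haversine formula."""
    lat1, lon1 = map(math.radians, p1)
    lat2, lon2 = map(math.radians, p2)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(PBC, LPS)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} nmi")
```

The result lands within a fraction of a kilometre of the 3935.825 km quoted above; the exact value depends on the Earth radius chosen.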

How long does it take to fly from Puebla to Lopez?

The estimated flight time from Puebla International Airport to Lopez Island Airport is 5 hours and 7 minutes.
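A simple way to reproduce an estimate like this is cruise time plus a fixed overhead for taxi, climb, and descent. The cruise speed and overhead below are illustrative assumptions, not the site's actual model, so the result is close to but not exactly the 5 hours 7 minutes quoted above:

```python
def flight_time(distance_miles: float, cruise_mph: float = 500.0,
                overhead_hours: float = 0.25) -> tuple[int, int]:
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent
    overhead. Both default parameters are illustrative assumptions."""
    hours = distance_miles / cruise_mph + overhead_hours
    h = int(hours)
    m = round((hours - h) * 60)
    if m == 60:  # guard against minutes rounding up to a full hour
        h, m = h + 1, 0
    return h, m

h, m = flight_time(2443)
print(f"about {h} h {m} min")  # close to the site's 5 h 7 min
```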

Flight carbon footprint between Puebla International Airport (PBC) and Lopez Island Airport (LPS)

On average, flying from Puebla to Lopez generates about 269 kg (592 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
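The pound figure is a straight unit conversion (1 lb is defined as exactly 0.45359237 kg). Converting the rounded 269 kg gives about 593 lb, so the page's 592 lb presumably comes from the unrounded per-passenger estimate:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg: float) -> float:
    """Convert kilograms to pounds using the exact definition."""
    return kg / KG_PER_LB

print(round(kg_to_lb(269)))  # the rounded 269 kg converts to ~593 lb
```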

Map of flight path and driving directions from Puebla to Lopez

See the map of the shortest flight path between Puebla International Airport (PBC) and Lopez Island Airport (LPS).

Airport information

Origin: Puebla International Airport
City: Puebla
Country: Mexico
IATA Code: PBC
ICAO Code: MMPB
Coordinates: 19°9′29″N, 98°22′17″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W