How far is Lopez, WA, from Dawson Creek?

The distance between Dawson Creek (Dawson Creek Airport) and Lopez (Lopez Island Airport) is 515 miles / 829 kilometers / 448 nautical miles.

The driving distance from Dawson Creek (YDQ) to Lopez (LPS) is 787 miles / 1266 kilometers, and travel time by car is about 17 hours 16 minutes.


Distance from Dawson Creek to Lopez

There are several ways to calculate the distance from Dawson Creek to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 515.254 miles
  • 829.220 kilometers
  • 447.743 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
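
For readers who want to reproduce the figure, here is a minimal pure-Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the airport coordinates from the table at the bottom of this page, converted to decimal degrees; function and variable names are our own.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

ydq = (55.742222, -120.182778)   # YDQ: 55°44′32″N, 120°10′58″W
lps = (48.483889, -122.937778)   # LPS: 48°29′2″N, 122°56′16″W
print(vincenty_distance(*ydq, *lps) / 1609.344)  # ≈ 515.25 miles
```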

Haversine formula
  • 514.844 miles
  • 828.561 kilometers
  • 447.387 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
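
The haversine version is much shorter. The sketch below assumes a mean Earth radius of 3,958.8 miles (about 6,371 km); the exact radius the calculator uses is not published, so the last decimal may differ.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance between two points on a sphere, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_mi * math.asin(math.sqrt(h))

print(haversine_miles(55.742222, -120.182778, 48.483889, -122.937778))
# ≈ 514.8 miles, close to the figure above
```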

How long does it take to fly from Dawson Creek to Lopez?

The estimated flight time from Dawson Creek Airport to Lopez Island Airport is 1 hour and 28 minutes.
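
Estimates like this are typically a fixed allowance for taxi, climb, and descent plus the great-circle distance divided by an assumed average cruise speed. The calculator's exact parameters are not published; the sketch below uses assumed values (500 mph cruise, 30-minute allowance) and so only approximates the figure above.

```python
def estimated_flight_time(distance_mi, cruise_mph=500, overhead_min=30):
    """Rough flight time: assumed fixed overhead plus airborne time."""
    total_min = overhead_min + distance_mi / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(515.254))
# "1 h 32 min" with these assumptions (the page shows 1 h 28 min)
```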

Flight carbon footprint between Dawson Creek Airport (YDQ) and Lopez Island Airport (LPS)

On average, flying from Dawson Creek to Lopez generates about 101 kg (222 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
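
Per-passenger figures like this are usually computed as distance times an average emission factor (average fuel burn per passenger-kilometer times roughly 3.16 kg of CO2 per kg of jet fuel burned). The factor below (~0.122 kg CO2 per passenger-km) is reverse-engineered from the numbers above purely for illustration; it is not the calculator's published value.

```python
def co2_per_passenger_kg(distance_km, factor_kg_per_km=0.122):
    """Distance-based CO2 estimate; the emission factor is an assumed average."""
    return distance_km * factor_kg_per_km

kg = co2_per_passenger_kg(829.220)
print(f"{kg:.0f} kg ≈ {kg * 2.20462:.0f} lbs")  # ≈ 101 kg ≈ 223 lbs
```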

Map of flight path and driving directions from Dawson Creek to Lopez

See the map of the shortest flight path between Dawson Creek Airport (YDQ) and Lopez Island Airport (LPS).

Airport information

Origin: Dawson Creek Airport
City: Dawson Creek
Country: Canada
IATA Code: YDQ
ICAO Code: CYDQ
Coordinates: 55°44′32″N, 120°10′58″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
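
The coordinates above are in degrees-minutes-seconds, while the distance formulas earlier expect decimal degrees. A small helper for the conversion (names are our own):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Dawson Creek Airport (YDQ): 55°44′32″N, 120°10′58″W
print(dms_to_decimal(55, 44, 32, "N"), dms_to_decimal(120, 10, 58, "W"))
# Lopez Island Airport (LPS): 48°29′2″N, 122°56′16″W
print(dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))
```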