
How far is Dawson Creek from Lopez, WA?

The distance between Lopez (Lopez Island Airport) and Dawson Creek (Dawson Creek Airport) is 515 miles / 829 kilometers / 448 nautical miles.

The driving distance from Lopez (LPS) to Dawson Creek (YDQ) is 786 miles / 1265 kilometers, and travel time by car is about 17 hours 14 minutes.

Lopez Island Airport – Dawson Creek Airport

  • 515 miles
  • 829 kilometers
  • 448 nautical miles


Distance from Lopez to Dawson Creek

There are several ways to calculate the distance from Lopez to Dawson Creek. Here are two standard methods:

Vincenty's formula (applied above)
  • 515.254 miles
  • 829.220 kilometers
  • 447.743 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
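As a rough sketch of how such an ellipsoidal distance is computed, here is a pure-Python implementation of Vincenty's inverse method on the WGS-84 ellipsoid (the standard choice; the page does not state which ellipsoid it uses). The airport coordinates are taken from the airport information section below, converted to decimal degrees.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L                        # iterate on the longitude difference
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)        # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# LPS (48°29′2″N, 122°56′16″W) to YDQ (55°44′32″N, 120°10′58″W)
meters = vincenty_inverse(48.48389, -122.93778, 55.74222, -120.18278)
print(round(meters / 1000, 2))   # roughly the 829.2 km quoted above
```

The iteration converges quickly for almost all point pairs; production code would also handle the rare nearly-antipodal cases where Vincenty's method fails to converge.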

Haversine formula
  • 514.844 miles
  • 828.561 kilometers
  • 447.387 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
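The haversine computation is much simpler and fits in a few lines. This sketch assumes a mean Earth radius of 6371 km, which reproduces the figure above to within a kilometer:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r (mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# LPS to YDQ, decimal-degree coordinates from the airport information below
km = haversine_km(48.48389, -122.93778, 55.74222, -120.18278)
print(round(km, 2))   # close to the 828.56 km quoted above
```

The small gap between the two results (about 0.7 km here) is the cost of treating the Earth as a perfect sphere rather than an ellipsoid.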

How long does it take to fly from Lopez to Dawson Creek?

The estimated flight time from Lopez Island Airport to Dawson Creek Airport is 1 hour and 28 minutes.
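The page does not say how it derives this estimate. A common rule of thumb, shown here purely as an illustration, adds a fixed allowance for taxi, climb, and descent to cruise time at a typical small-aircraft-to-jet speed; with an assumed 500 mph cruise and 30 minutes of overhead it lands a few minutes above the site's figure:

```python
def flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed overhead plus time at cruise speed.
    Both parameters are assumptions, not the calculator's actual model."""
    return overhead_min + distance_miles / cruise_mph * 60

hours, minutes = divmod(round(flight_minutes(515)), 60)
print(f"{hours} h {minutes} min")   # 1 h 32 min under these assumptions
```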

Flight carbon footprint between Lopez Island Airport (LPS) and Dawson Creek Airport (YDQ)

On average, flying from Lopez to Dawson Creek generates about 101 kg (222 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
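The per-passenger figure is consistent with a flat emission rate of roughly 0.196 kg of CO2 per passenger-mile (101 kg ÷ 515 mi); that rate is inferred from the page's own numbers, not documented by it. A minimal sketch of the arithmetic, including the kilogram-to-pound conversion:

```python
KG_CO2_PER_PASSENGER_MILE = 0.196   # inferred from 101 kg / 515 mi; an assumption
LB_PER_KG = 2.20462                 # standard conversion factor

distance_miles = 515
co2_kg = KG_CO2_PER_PASSENGER_MILE * distance_miles
co2_lb = co2_kg * LB_PER_KG

print(round(co2_kg))   # about 101 kg per passenger
print(round(co2_lb))   # about 222 lbs
```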

Map of flight path and driving directions from Lopez to Dawson Creek

See the map of the shortest flight path between Lopez Island Airport (LPS) and Dawson Creek Airport (YDQ).

Airport information

Origin Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
Destination Dawson Creek Airport
City: Dawson Creek
Country: Canada
IATA Code: YDQ
ICAO Code: CYDQ
Coordinates: 55°44′32″N, 120°10′58″W
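The distance formulas above take decimal degrees, while the airport coordinates are listed in degrees-minutes-seconds. A small helper (a hypothetical name, written for this page's exact `48°29′2″N` notation) performs the conversion:

```python
import re

def dms_to_decimal(dms):
    """Convert a '48°29′2″N'-style coordinate string to signed decimal degrees.
    South and west hemispheres become negative values."""
    m = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    if not m:
        raise ValueError(f"unrecognized coordinate: {dms!r}")
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(round(dms_to_decimal("48°29′2″N"), 5))     # 48.48389
print(round(dms_to_decimal("122°56′16″W"), 5))   # -122.93778
```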