
How far is Fort St. John from Lopez, WA?

The distance between Lopez (Lopez Island Airport) and Fort St. John (Fort St. John Airport) is 544 miles / 876 kilometers / 473 nautical miles.

The driving distance from Lopez (LPS) to Fort St. John (YXJ) is 801 miles / 1289 kilometers, and travel time by car is about 17 hours 41 minutes.

Lopez Island Airport – Fort St. John Airport

544 Miles
876 Kilometers
473 Nautical miles


Distance from Lopez to Fort St. John

There are several ways to calculate the distance from Lopez to Fort St. John. Here are two standard methods:

Vincenty's formula (applied above)
  • 544.083 miles
  • 875.616 kilometers
  • 472.795 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
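As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed below (converted to decimal degrees). The iteration tolerance and coordinate rounding are assumptions of the sketch, so the result should land close to, but not necessarily exactly on, the 875.616 km figure above.

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal (WGS-84) distance via Vincenty's inverse formula."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # metres -> kilometres

# LPS and YXJ coordinates from the airport information section, in decimal degrees
print(vincenty_distance_km(48.4839, -122.9378, 56.2381, -120.7397))  # close to 875.6 km
```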

Haversine formula
  • 543.658 miles
  • 874.933 kilometers
  • 472.426 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
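For comparison, a short haversine sketch is below. The mean earth radius of 6371 km is an assumption (the page does not state which radius it uses), so the output may differ slightly from the 874.933 km figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere via the haversine formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

# Same LPS / YXJ coordinates as above; should come out near the 874.9 km figure
print(haversine_km(48.4839, -122.9378, 56.2381, -120.7397))
```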

How long does it take to fly from Lopez to Fort St. John?

The estimated flight time from Lopez Island Airport to Fort St. John Airport is 1 hour and 31 minutes.
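The assumptions behind this estimate are not spelled out on the page. A common back-of-the-envelope approach is cruise distance divided by an average airliner speed, plus a fixed allowance for taxi, climb, and descent. The 500 mph speed and 30-minute allowance below are illustrative assumptions, so the sketch lands near, rather than exactly on, the 1 hour 31 minute figure.

```python
def estimate_flight_time_min(distance_miles, cruise_mph=500, fixed_allowance_min=30):
    """Rule-of-thumb flight time: cruise time plus a fixed taxi/climb/descent allowance."""
    return distance_miles / cruise_mph * 60 + fixed_allowance_min

minutes = estimate_flight_time_min(544.083)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # roughly 1 h 35 min with these assumptions
```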

Flight carbon footprint between Lopez Island Airport (LPS) and Fort St. John Airport (YXJ)

On average, flying from Lopez to Fort St. John generates about 105 kg of CO2 per passenger; 105 kilograms equals 231 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
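The methodology behind the 105 kg figure is not given here. The snippet below simply back-derives the implied per-kilometre intensity from the page's own numbers and converts kilograms to pounds; the intensity value describes this route only and is not a general emission factor.

```python
co2_kg = 105.0          # per-passenger estimate quoted above
distance_km = 875.616   # Vincenty distance quoted above

print(co2_kg * 2.20462)      # about 231 lbs
print(co2_kg / distance_km)  # implied intensity of roughly 0.12 kg CO2 per passenger-km on this route
```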

Map of flight path and driving directions from Lopez to Fort St. John

See the map of the shortest flight path between Lopez Island Airport (LPS) and Fort St. John Airport (YXJ).

Airport information

Origin: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
Destination: Fort St. John Airport
City: Fort St. John
Country: Canada
IATA Code: YXJ
ICAO Code: CYXJ
Coordinates: 56°14′17″N, 120°44′23″W
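To reuse the coordinates above in a distance calculation, here is a small sketch converting the degrees/minutes/seconds values to decimal degrees. The sign convention (negative for west longitude and south latitude) is an assumption of the snippet, not something stated on the page.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Lopez Island Airport (LPS): 48°29′2″N, 122°56′16″W
print(dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))
# Fort St. John Airport (YXJ): 56°14′17″N, 120°44′23″W
print(dms_to_decimal(56, 14, 17, "N"), dms_to_decimal(120, 44, 23, "W"))
```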