How far is Lopez, WA, from Saint John?
The distance between Saint John (Saint John Airport) and Lopez (Lopez Island Airport) is 2647 miles / 4259 kilometers / 2300 nautical miles.
The driving distance from Saint John (YSJ) to Lopez (LPS) is 3350 miles / 5392 kilometers, and travel time by car is about 64 hours 49 minutes.
Saint John Airport – Lopez Island Airport
Distance from Saint John to Lopez
There are several ways to calculate the distance from Saint John to Lopez. Here are two standard methods:
Vincenty's formula (applied above)
- 2646.685 miles
- 4259.426 kilometers
- 2299.906 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
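For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid; the function name, convergence tolerance, and iteration cap are illustrative choices, not this page's exact implementation:

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0          # semi-major axis (m)
    f = 1 / 298.257223563  # flattening
    b = (1 - f) * a        # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until it converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1.0 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial line case
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)

# YSJ -> LPS, decimal degrees from the DMS coordinates listed below
m = vincenty_distance_m(45.31583, -65.89028, 48.48389, -122.93778)
print(m / 1609.344)  # distance in miles
```

With the airport coordinates listed under Airport information below, this should agree with the 2646.685-mile figure to within a small rounding difference.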
Haversine formula
- 2639.073 miles
- 4247.176 kilometers
- 2293.292 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
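A corresponding sketch of the haversine formula; the 6,371 km mean earth radius is an assumption, and other common radius choices shift the result slightly:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

print(haversine_km(45.31583, -65.89028, 48.48389, -122.93778))  # ~4247 km
```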
How long does it take to fly from Saint John to Lopez?
The estimated flight time from Saint John Airport to Lopez Island Airport is 5 hours and 30 minutes.
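The page does not state how this estimate is derived. A common rule of thumb adds a fixed take-off and landing allowance to cruise time at an assumed average speed; the sketch below uses 500 mph and 30 minutes, which are assumptions and do not exactly reproduce the 5 hour 30 minute figure:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Cruise time at an assumed average speed plus a fixed
    allowance for taxi, take-off, and landing. Both parameters
    are assumptions, not this page's stated inputs."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(2646.685))  # ~5 h 48 min with these assumptions
```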
What is the time difference between Saint John and Lopez?
The time difference between Saint John and Lopez is 4 hours: Lopez is 4 hours behind Saint John.
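The offset is easy to check with Python's standard library, assuming the IANA zones America/Moncton (Atlantic Time, used in Saint John, NB) and America/Los_Angeles (Pacific Time, used on Lopez Island, WA):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Noon in Saint John, NB (Atlantic Time) ...
noon_sj = datetime(2024, 1, 15, 12, 0, tzinfo=ZoneInfo("America/Moncton"))
# ... is the same moment expressed in Lopez Island's Pacific Time.
same_moment = noon_sj.astimezone(ZoneInfo("America/Los_Angeles"))
print(same_moment.time())  # 08:00:00 -- Lopez is 4 hours behind
```

Both zones observe daylight saving time on the same dates, so the 4-hour offset holds year-round.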
Flight carbon footprint between Saint John Airport (YSJ) and Lopez Island Airport (LPS)
On average, flying from Saint John to Lopez generates about 292 kg of CO2 per passenger; 292 kilograms equals about 644 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
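The unit conversion can be verified directly; the per-mile figure below is inferred from this page's own numbers, not from the site's stated methodology:

```python
KG_PER_LB = 0.45359237   # exact definition of the pound

co2_kg = 292             # page's per-passenger estimate
print(round(co2_kg / KG_PER_LB))    # -> 644 lbs
print(round(co2_kg / 2646.685, 3))  # ~0.11 kg CO2 per passenger-mile (inferred)
```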
Map of flight path and driving directions from Saint John to Lopez
See the map of the shortest flight path between Saint John Airport (YSJ) and Lopez Island Airport (LPS).
Airport information
| Origin | Saint John Airport |
| --- | --- |
| City: | Saint John |
| Country: | Canada |
| IATA Code: | YSJ |
| ICAO Code: | CYSJ |
| Coordinates: | 45°18′57″N, 65°53′25″W |
| Destination | Lopez Island Airport |
| --- | --- |
| City: | Lopez, WA |
| Country: | United States |
| IATA Code: | LPS |
| ICAO Code: | S31 |
| Coordinates: | 48°29′2″N, 122°56′16″W |
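The coordinates above are given in degrees-minutes-seconds; a small helper (the function name and regex are illustrative) converts them to the decimal degrees that the distance formulas earlier on this page expect:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a '45°18′57″N' style coordinate to decimal degrees."""
    deg, mnt, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(mnt) / 60 + int(sec) / 3600
    # South and West hemispheres are negative by convention
    return -value if hemi in "SW" else value

print(dms_to_decimal("45°18′57″N"), dms_to_decimal("65°53′25″W"))
# -> 45.31583... -65.89027...
```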