How far is Prince George from Lopez, WA?
The distance between Lopez (Lopez Island Airport) and Prince George (Prince George Airport) is 374 miles / 602 kilometers / 325 nautical miles.
The driving distance from Lopez (LPS) to Prince George (YXS) is 531 miles / 854 kilometers, and travel time by car is about 12 hours 14 minutes.
Lopez Island Airport – Prince George Airport
Distance from Lopez to Prince George
There are several ways to calculate the distance from Lopez to Prince George. Here are two standard methods:
Vincenty's formula (applied above)
- 373.843 miles
- 601.643 kilometers
- 324.861 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
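For illustration, the ellipsoidal figure can be reproduced with the geographiclib Python package. Note that geographiclib implements Karney's algorithm rather than Vincenty's, but both work on the WGS-84 ellipsoid and agree to well under a metre at this range; the decimal coordinates below are converted from the airport tables at the end of this page.

```python
# Ellipsoidal (WGS-84) geodesic distance between LPS and YXS.
# geographiclib uses Karney's algorithm, which agrees with Vincenty's
# formula to well under a metre over a route of this length.
from geographiclib.geodesic import Geodesic

LPS = (48.4839, -122.9378)   # 48°29′2″N, 122°56′16″W
YXS = (53.8892, -122.6789)   # 53°53′21″N, 122°40′44″W

g = Geodesic.WGS84.Inverse(LPS[0], LPS[1], YXS[0], YXS[1])
meters = g["s12"]

print(f"{meters / 1609.344:.3f} miles")        # ~373.8 mi
print(f"{meters / 1000:.3f} kilometers")       # ~601.6 km
print(f"{meters / 1852:.3f} nautical miles")   # ~324.9 NM
```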
Haversine formula
- 373.652 miles
- 601.335 kilometers
- 324.695 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
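As a self-contained sketch, the great-circle figure above can be reproduced in a few lines of Python, assuming a mean Earth radius of 6,371 km and the same airport coordinates:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# LPS (48°29′2″N, 122°56′16″W) to YXS (53°53′21″N, 122°40′44″W)
km = haversine_km(48.4839, -122.9378, 53.8892, -122.6789)
print(f"{km:.1f} km ≈ {km / 1.609344:.1f} miles")   # ~601.3 km ≈ ~373.7 miles
```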
How long does it take to fly from Lopez to Prince George?
The estimated flight time from Lopez Island Airport to Prince George Airport is 1 hour and 12 minutes.
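The page does not spell out how this estimate is produced; a common heuristic for calculators like this is a fixed allowance for taxi, climb and descent plus the great-circle distance flown at a typical regional cruise speed. A minimal sketch with assumed constants (500 mph average, 30-minute allowance) lands close to the figure above:

```python
# Hypothetical flight-time heuristic: the constants here (average block
# speed and fixed taxi/climb allowance) are assumptions, not the site's
# published method.
distance_miles = 373.843        # Vincenty distance from above
avg_speed_mph = 500             # assumed average speed
fixed_minutes = 30              # assumed taxi/climb/descent allowance

total_minutes = fixed_minutes + distance_miles / avg_speed_mph * 60
print(f"~{int(total_minutes // 60)} h {round(total_minutes % 60)} min")  # ~1 h 15 min
```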
What is the time difference between Lopez and Prince George?
There is no time difference between Lopez and Prince George.
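Both airports observe Pacific Time (Lopez, WA falls under America/Los_Angeles; Prince George, BC under America/Vancouver), which can be confirmed with Python's zoneinfo module:

```python
# Both cities observe Pacific Time, so their UTC offsets always match.
from datetime import datetime
from zoneinfo import ZoneInfo

now = datetime.now(tz=ZoneInfo("UTC"))
lopez_offset = now.astimezone(ZoneInfo("America/Los_Angeles")).utcoffset()
pg_offset = now.astimezone(ZoneInfo("America/Vancouver")).utcoffset()

print(lopez_offset == pg_offset)   # True: no time difference
```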
Flight carbon footprint between Lopez Island Airport (LPS) and Prince George Airport (YXS)
On average, flying from Lopez to Prince George generates about 80 kg of CO2 per passenger, and 80 kilograms equals 176 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
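The emissions model itself is not shown here, but the quoted figures can be sanity-checked with simple arithmetic: the kilogram-to-pound factor is exact, and the per-kilometre intensity below is simply derived from the numbers above rather than taken from any published methodology.

```python
# Sanity-check the quoted carbon figures.
co2_kg = 80.0                 # quoted per-passenger CO2
distance_km = 601.6           # ellipsoidal distance from above

print(f"{co2_kg * 2.20462:.0f} lbs")                    # 176 lbs
print(f"{co2_kg / distance_km:.3f} kg CO2 per pax-km")  # ~0.133 kg/km
```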
Map of flight path and driving directions from Lopez to Prince George
See the map of the shortest flight path between Lopez Island Airport (LPS) and Prince George Airport (YXS).
Airport information
| Origin | Lopez Island Airport |
|---|---|
| City: | Lopez, WA |
| Country: | United States |
| IATA Code: | LPS |
| ICAO Code: | S31 |
| Coordinates: | 48°29′2″N, 122°56′16″W |
| Destination | Prince George Airport |
|---|---|
| City: | Prince George |
| Country: | Canada |
| IATA Code: | YXS |
| ICAO Code: | CYXS |
| Coordinates: | 53°53′21″N, 122°40′44″W |