How far is Lopez, WA, from Cartwright?
The distance between Cartwright (Cartwright Airport) and Lopez (Lopez Island Airport) is 2787 miles / 4485 kilometers / 2422 nautical miles.
The driving distance from Cartwright (YRF) to Lopez (LPS) is 4210 miles / 6776 kilometers, and travel time by car is about 85 hours 49 minutes.
Cartwright Airport – Lopez Island Airport
Distance from Cartwright to Lopez
There are several ways to calculate the distance from Cartwright to Lopez. Here are two standard methods:
Vincenty's formula (applied above)
- 2787.118 miles
- 4485.431 kilometers
- 2421.939 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
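For illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, iteration cap, and convergence tolerance are my own choices and may differ from whatever implementation produced the figures above.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until the auxiliary-sphere longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sigma_m ** 2)
        - B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YRF and LPS in decimal degrees (converted from the DMS coordinates in the table below)
metres = vincenty_distance_m(53.6828, -57.0417, 48.4839, -122.9378)
print(metres / 1609.344)  # should land close to the 2787.118 miles quoted above
```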
Haversine formula
- 2778.501 miles
- 4471.563 kilometers
- 2414.451 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
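A haversine sketch is much shorter. This version assumes a mean earth radius of 6371 km (the radius used for the figures above isn't stated), so its result should land near the 2778.5 miles / 4471.6 km quoted:

```python
import math

def haversine_distance_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same YRF/LPS decimal coordinates as above
print(haversine_distance_km(53.6828, -57.0417, 48.4839, -122.9378))  # roughly 4470-4472 km
```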
How long does it take to fly from Cartwright to Lopez?
The estimated flight time from Cartwright Airport to Lopez Island Airport is 5 hours and 46 minutes.
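The site's exact assumptions aren't stated. A common back-of-envelope estimate divides the great-circle distance by an assumed average cruise speed and adds a fixed allowance for takeoff and landing. A minimal sketch with illustrative numbers (the 500 mph cruise speed and 30-minute allowance are assumptions, so it won't reproduce the 5 hour 46 minute figure exactly):

```python
def estimated_flight_time_hours(distance_miles, cruise_mph=500, allowance_hours=0.5):
    # Cruise time plus a fixed allowance for climb, descent and taxiing.
    return distance_miles / cruise_mph + allowance_hours

hours = estimated_flight_time_hours(2787.118)
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # about 6 h 4 min with these assumed numbers
```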
What is the time difference between Cartwright and Lopez?
The time difference between Cartwright and Lopez is 4 hours. Lopez is 4 hours behind Cartwright.
Flight carbon footprint between Cartwright Airport (YRF) and Lopez Island Airport (LPS)
On average, flying from Cartwright to Lopez generates about 309 kg (681 lbs) of CO2 per passenger. This figure is an estimate and includes only the CO2 produced by burning jet fuel.
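As a rough check, per-passenger figures like this are typically distance multiplied by an average emission factor; dividing the quoted 309 kg by the 2787-mile Vincenty distance implies roughly 0.11 kg of CO2 per passenger-mile. A minimal sketch using that back-calculated factor (an assumption, not the calculator's published methodology):

```python
LBS_PER_KG = 2.20462

def co2_per_passenger_kg(distance_miles, kg_per_passenger_mile=0.111):
    # 0.111 kg per passenger-mile is back-calculated from the 309 kg / 2787-mile figures above.
    return distance_miles * kg_per_passenger_mile

kg = co2_per_passenger_kg(2787.118)
print(round(kg), "kg /", round(kg * LBS_PER_KG), "lbs")  # about 309 kg / 682 lbs
```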
Map of flight path and driving directions from Cartwright to Lopez
See the map of the shortest flight path between Cartwright Airport (YRF) and Lopez Island Airport (LPS).
Airport information
| Origin | Cartwright Airport |
|---|---|
| City | Cartwright |
| Country | Canada |
| IATA Code | YRF |
| ICAO Code | CYCA |
| Coordinates | 53°40′58″N, 57°2′30″W |
| Destination | Lopez Island Airport |
|---|---|
| City | Lopez, WA |
| Country | United States |
| IATA Code | LPS |
| ICAO Code | S31 |
| Coordinates | 48°29′2″N, 122°56′16″W |
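The coordinates above are given in degrees, minutes and seconds; the decimal values used in the code sketches earlier come from a conversion like this small helper (not part of the original page):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Southern and western hemispheres carry a negative sign.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(53, 40, 58, "N"), dms_to_decimal(57, 2, 30, "W"))   # YRF: ~53.6828, ~-57.0417
print(dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))  # LPS: ~48.4839, ~-122.9378
```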