How far is Lopez, WA, from Bar Harbor, ME?
The distance between Bar Harbor (Hancock County–Bar Harbor Airport) and Lopez (Lopez Island Airport) is 2564 miles / 4126 kilometers / 2228 nautical miles.
The driving distance from Bar Harbor (BHB) to Lopez (LPS) is 3203 miles / 5154 kilometers, and travel time by car is about 61 hours 51 minutes.
Distance from Bar Harbor to Lopez
There are several ways to calculate the distance from Bar Harbor to Lopez. Here are two standard methods:
Vincenty's formula (applied above)
- 2563.895 miles
- 4126.189 kilometers
- 2227.964 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
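For reference, below is a minimal pure-Python sketch of Vincenty's inverse solution on the WGS-84 ellipsoid. It is simplified (coincident and nearly antipodal points are not handled), and the function name and the decimal coordinates are supplied here for illustration; they come from converting the airport coordinates listed further down.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2          # zero only on equatorial lines
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm
                                     + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters -> statute miles

# BHB (44°27'0"N, 68°21'41"W) to LPS (48°29'2"N, 122°56'16"W)
print(vincenty_miles(44.45, -68.361389, 48.483889, -122.937778))  # ~2563.9
```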
Haversine formula
- 2556.628 miles
- 4114.493 kilometers
- 2221.649 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
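A matching haversine sketch, using the conventional 6,371 km mean Earth radius (a different assumed radius shifts the result slightly) and the same illustrative coordinates as above:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344       # kilometers -> statute miles

print(haversine_miles(44.45, -68.361389, 48.483889, -122.937778))  # ~2556.6
```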
How long does it take to fly from Bar Harbor to Lopez?
The estimated flight time from Hancock County–Bar Harbor Airport to Lopez Island Airport is 5 hours and 21 minutes.
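The page does not state how this estimate is derived; a common approach is simply the great-circle distance divided by an assumed average block speed. The sketch below assumes roughly 480 mph, a value picked only so the output lands near the quoted 5 hours 21 minutes; it is an assumption, not the site's actual model.

```python
def flight_time(distance_miles, avg_speed_mph=480.0):
    """Rough block time: distance over an assumed average speed.
    (Minute rollover to the next hour is not handled in this sketch.)"""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours {m} minutes"

print(flight_time(2564))  # "5 hours 20 minutes", close to the quoted 5 h 21 min
```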
What is the time difference between Bar Harbor and Lopez?
The time difference between Bar Harbor and Lopez is 3 hours: Lopez (Pacific Time) is 3 hours behind Bar Harbor (Eastern Time).
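This is the standard Eastern/Pacific split, which can be verified with Python's standard-library zoneinfo. The date below is arbitrary; both zones observe daylight saving on the same schedule, so the gap stays 3 hours year-round.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Bar Harbor, ME is on Eastern Time; Lopez, WA is on Pacific Time.
when = datetime(2024, 6, 1, 12, 0)
east = when.replace(tzinfo=ZoneInfo("America/New_York"))
west = when.replace(tzinfo=ZoneInfo("America/Los_Angeles"))
print(east.utcoffset() - west.utcoffset())  # 3:00:00 -> Lopez is 3 hours behind
```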
Flight carbon footprint between Hancock County–Bar Harbor Airport (BHB) and Lopez Island Airport (LPS)
On average, flying from Bar Harbor to Lopez generates about 283 kg of CO2 per passenger, which is equivalent to 623 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
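As a sanity check, the pounds figure is a plain unit conversion, and the per-passenger estimate behaves like a linear emission factor. The factor below (about 0.11 kg of CO2 per mile) is back-derived from the page's own numbers, not an official value; real emissions vary by aircraft type and load factor.

```python
KG_PER_LB = 0.45359237

def co2_estimate_kg(distance_miles, kg_per_mile=0.1104):
    """Assumed linear emission factor, back-derived from 283 kg over 2,564 miles."""
    return distance_miles * kg_per_mile

kg = co2_estimate_kg(2564)
print(round(kg), "kg =", round(kg / KG_PER_LB), "lbs")  # 283 kg = 624 lbs
# (the page rounds the pounds figure down to 623)
```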
Map of flight path and driving directions from Bar Harbor to Lopez
See the map of the shortest flight path between Hancock County–Bar Harbor Airport (BHB) and Lopez Island Airport (LPS).
Airport information
| Origin | Hancock County–Bar Harbor Airport |
| --- | --- |
| City | Bar Harbor, ME |
| Country | United States |
| IATA Code | BHB |
| ICAO Code | KBHB |
| Coordinates | 44°27′0″N, 68°21′41″W |
| Destination | Lopez Island Airport |
| --- | --- |
| City | Lopez, WA |
| Country | United States |
| IATA Code | LPS |
| ICAO Code | S31 |
| Coordinates | 48°29′2″N, 122°56′16″W |
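The coordinates above are given in degrees, minutes, and seconds. A small helper (hypothetical, and assuming the ′/″ notation used in these tables) converts them to the signed decimal degrees that the distance formulas earlier on this page expect:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 44°27′0″N to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value   # south/west are negative

print(dms_to_decimal("44°27′0″N"), dms_to_decimal("68°21′41″W"))   # 44.45 -68.3614
print(dms_to_decimal("48°29′2″N"), dms_to_decimal("122°56′16″W"))  # 48.4839 -122.9378
```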