
How far is Lopez, WA, from New Haven, CT?

The distance between New Haven (Tweed New Haven Airport) and Lopez (Lopez Island Airport) is 2462 miles / 3962 kilometers / 2139 nautical miles.

The driving distance from New Haven (HVN) to Lopez (LPS) is 3006 miles / 4837 kilometers, and travel time by car is about 55 hours 18 minutes.

Tweed New Haven Airport – Lopez Island Airport


Distance from New Haven to Lopez

There are several ways to calculate the distance from New Haven to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2461.785 miles
  • 3961.859 kilometers
  • 2139.233 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
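The iterative inverse method can be sketched in plain Python. This is a standard-form implementation on the WGS-84 ellipsoid, not the site's own code; the coordinates are the airport coordinates listed below, converted from degrees-minutes-seconds to decimal degrees.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a = 6378137.0                  # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis (meters)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L                        # first guess for the longitude difference
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)   # equatorial lines: cos2_alpha = 0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# HVN (41°15′49″N, 72°53′12″W) to LPS (48°29′2″N, 122°56′16″W)
print(vincenty_km(41.263611, -72.886667, 48.483889, -122.937778))
```

Run on the two airports, this returns approximately 3962 km, in line with the figure quoted above.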

Haversine formula
  • 2455.290 miles
  • 3951.406 kilometers
  • 2133.589 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
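The haversine formula is short enough to show in full. This sketch assumes a mean Earth radius of 6371 km, which is a common convention; the site's exact radius choice isn't stated.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km), via haversine."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)   # difference in latitude
    dlam = math.radians(lon2 - lon1)   # difference in longitude
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

# HVN (41°15′49″N, 72°53′12″W) to LPS (48°29′2″N, 122°56′16″W)
print(haversine_km(41.263611, -72.886667, 48.483889, -122.937778))
```

The result is roughly 3951 km, a little shorter than the Vincenty figure because the spherical model ignores the Earth's flattening.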

How long does it take to fly from New Haven to Lopez?

The estimated flight time from Tweed New Haven Airport to Lopez Island Airport is 5 hours and 9 minutes.
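The model behind the estimate isn't published, but a quoted time of 5 hours 9 minutes over 2462 miles implies an average block speed of roughly 480 mph, which is typical for such estimators. A minimal sketch under that assumption:

```python
def flight_time(distance_miles, avg_speed_mph=480.0):
    """Rough flight-time estimate as (hours, minutes).

    The ~480 mph average block speed is an assumption chosen to
    illustrate the arithmetic, not the site's actual model.
    """
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

print(flight_time(2462))   # close to the 5 h 9 min quoted above
```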

Flight carbon footprint between Tweed New Haven Airport (HVN) and Lopez Island Airport (LPS)

On average, flying from New Haven to Lopez generates about 271 kg of CO2 per passenger, which is roughly 597 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
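The unit conversion checks out, using the exact definition of the pound (1 lb = 0.45359237 kg):

```python
KG_PER_LB = 0.45359237          # exact definition of the avoirdupois pound

co2_kg = 271                    # per-passenger estimate from above
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))            # 597
```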

Map of flight path and driving directions from New Haven to Lopez

See the map of the shortest flight path between Tweed New Haven Airport (HVN) and Lopez Island Airport (LPS).

Airport information

Origin: Tweed New Haven Airport
City: New Haven, CT
Country: United States
IATA Code: HVN
ICAO Code: KHVN
Coordinates: 41°15′49″N, 72°53′12″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W