
How far is Lopez, WA, from Kitchener?

The distance between Kitchener (Region of Waterloo International Airport) and Lopez (Lopez Island Airport) is 2051 miles / 3301 kilometers / 1783 nautical miles.

The driving distance from Kitchener (YKF) to Lopez (LPS) is 2500 miles / 4023 kilometers, and travel time by car is about 46 hours 35 minutes.


Distance from Kitchener to Lopez

There are several ways to calculate the distance from Kitchener to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2051.384 miles
  • 3301.382 kilometers
  • 1782.604 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
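For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The ellipsoid choice, convergence tolerance, and decimal coordinates (converted from the DMS values in the airport information section) are assumptions; the page does not state which parameters it uses.

```python
import math

def vincenty(lat1, lon1, lat2, lon2):
    """Inverse Vincenty formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):      # iterate the auxiliary longitude to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# YKF and LPS, converted from the DMS coordinates in the airport section
m = vincenty(43.46056, -80.37833, 48.48389, -122.93778)
print(f"{m / 1000:.1f} km / {m / 1609.344:.1f} mi")  # ≈ 3301 km / 2051 mi
```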

Haversine formula
  • 2045.740 miles
  • 3292.299 kilometers
  • 1777.699 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
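As a concrete illustration, here is a short Python sketch of the haversine formula. The mean Earth radius of 6,371 km and the decimal coordinates (converted from the DMS values in the airport information section) are assumptions; the page's exact inputs may differ slightly.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YKF and LPS, converted from the DMS coordinates in the airport section
ykf = (43.46056, -80.37833)   # 43°27′38″N, 80°22′42″W
lps = (48.48389, -122.93778)  # 48°29′2″N, 122°56′16″W

km = haversine(*ykf, *lps)
print(f"{km:.1f} km / {km * 0.621371:.1f} mi")  # ≈ 3292 km / 2046 mi
```

The small gap between this result and the Vincenty figure above reflects the spherical versus ellipsoidal Earth models.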

How long does it take to fly from Kitchener to Lopez?

The estimated flight time from Region of Waterloo International Airport to Lopez Island Airport is 4 hours and 23 minutes.
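The page does not say how it derives this estimate. A common rule of thumb for such calculators is the great-circle distance divided by an average speed, plus a fixed allowance for taxi, climb, and descent; the sketch below uses assumed parameters (500 mph and 30 minutes) and lands in the same ballpark.

```python
def flight_time_hours(distance_mi, cruise_mph=500, overhead_min=30):
    # Assumed parameters: ~500 mph average speed plus a fixed
    # 30-minute allowance for taxi, climb, and descent.
    return distance_mi / cruise_mph + overhead_min / 60

hours = flight_time_hours(2051)
h, m = int(hours), round((hours % 1) * 60)
print(f"{h} h {m} min")  # ≈ 4 h 36 min, close to the page's 4 h 23 min
```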

Flight carbon footprint between Region of Waterloo International Airport (YKF) and Lopez Island Airport (LPS)

On average, flying from Kitchener to Lopez generates about 223 kg of CO2 per passenger, which is roughly 492 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
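The per-passenger figure comes from the page; only the standard kilogram-to-pound conversion is reproduced below.

```python
kg = 223                           # page's per-passenger CO2 estimate
lbs = kg * 2.20462                 # 1 kg ≈ 2.20462 lb
print(f"{kg} kg ≈ {lbs:.0f} lbs")  # 223 kg ≈ 492 lbs
```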

Map of flight path and driving directions from Kitchener to Lopez

See the map of the shortest flight path between Region of Waterloo International Airport (YKF) and Lopez Island Airport (LPS).

Airport information

Origin: Region of Waterloo International Airport
City: Kitchener
Country: Canada
IATA Code: YKF
ICAO Code: CYKF
Coordinates: 43°27′38″N, 80°22′42″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W