
How far is Lopez, WA, from Kingman, AZ?

The distance between Kingman (Kingman Airport (Arizona)) and Lopez (Lopez Island Airport) is 1022 miles / 1645 kilometers / 888 nautical miles.

The driving distance from Kingman (IGM) to Lopez (LPS) is 1325 miles / 2132 kilometers, and travel time by car is about 25 hours 34 minutes.

Kingman Airport (Arizona) – Lopez Island Airport

  • 1022 miles
  • 1645 kilometers
  • 888 nautical miles


Distance from Kingman to Lopez

There are several ways to calculate the distance from Kingman to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 1022.294 miles
  • 1645.223 kilometers
  • 888.349 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
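As a concrete illustration, the iterative inverse method can be sketched in Python on the WGS-84 ellipsoid (a minimal sketch of the standard algorithm; whether the calculator above uses exactly these parameters is an assumption):

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in metres between two points, Vincenty inverse formula (WGS-84)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = ((cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha)
                        if cos2_alpha else 0.0)   # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# IGM and LPS coordinates from the airport information below,
# converted to decimal degrees (west longitude negative)
d_km = vincenty_distance(35.2594, -113.9378, 48.4839, -122.9378) / 1000
print(f"{d_km:.1f} km")   # close to the 1645 km quoted above
```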

Haversine formula
  • 1022.664 miles
  • 1645.819 kilometers
  • 888.671 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
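The spherical version is much shorter; a minimal sketch assuming a mean Earth radius of 6371 km (the calculator's exact radius is unknown, so results may differ in the last decimal place):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# IGM to LPS, decimal degrees (west longitude negative)
d_km = haversine_distance(35.2594, -113.9378, 48.4839, -122.9378)
print(f"{d_km:.1f} km")   # close to the 1645.8 km quoted above
```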

How long does it take to fly from Kingman to Lopez?

The estimated flight time from Kingman Airport (Arizona) to Lopez Island Airport is 2 hours and 26 minutes.
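Estimates like this usually come from a rule of thumb rather than a timetable. A minimal sketch, assuming an average cruise speed of about 500 mph plus roughly 30 minutes of taxi, climb, and descent overhead (both figures are assumptions, and they give a slightly different answer than the 2 hours 26 minutes quoted above):

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight time: fixed overhead plus time at an assumed cruise speed."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1022))   # 2 h 33 min with these assumptions
```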

Flight carbon footprint between Kingman Airport (Arizona) (IGM) and Lopez Island Airport (LPS)

On average, flying from Kingman to Lopez generates about 152 kg of CO2 per passenger, which is roughly 335 pounds (lb). The figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Kingman to Lopez

See the map of the shortest flight path between Kingman Airport (Arizona) (IGM) and Lopez Island Airport (LPS).

Airport information

Origin Kingman Airport (Arizona)
City: Kingman, AZ
Country: United States
IATA Code: IGM
ICAO Code: KIGM
Coordinates: 35°15′34″N, 113°56′16″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
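The coordinates above are given in degrees, minutes, and seconds; the distance formulas work in decimal degrees. A small conversion helper (a hypothetical utility, not part of the calculator):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Kingman Airport: 35°15′34″N, 113°56′16″W
print(dms_to_decimal(35, 15, 34, "N"))    # about 35.2594
print(dms_to_decimal(113, 56, 16, "W"))   # about -113.9378
```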