How far is Thunder Bay from Lake Havasu City, AZ?
The distance between Lake Havasu City (Lake Havasu City Airport) and Thunder Bay (Thunder Bay International Airport) is 1599 miles / 2573 kilometers / 1389 nautical miles.
The driving distance from Lake Havasu City (HII) to Thunder Bay (YQT) is 2086 miles / 3357 kilometers, and travel time by car is about 38 hours 1 minute.
Lake Havasu City Airport – Thunder Bay International Airport
Distance from Lake Havasu City to Thunder Bay
There are several ways to calculate the distance from Lake Havasu City to Thunder Bay. Here are two standard methods:
Vincenty's formula (applied above)
- 1598.818 miles
- 2573.048 kilometers
- 1389.335 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
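If you want to reproduce the figure yourself, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates are the decimal-degree equivalents of the values in the tables further down, and the iteration tolerance is an assumption, not a parameter published on this page.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); falls back to 0 for points on the equator
        cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                  if cos_sq_alpha else 0.0)
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2)
        - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# HII and YQT coordinates from the airport tables below, in decimal degrees
hii = (34.5708, -114.3578)   # 34°34′15″N, 114°21′28″W
yqt = (48.3717, -89.3239)    # 48°22′18″N, 89°19′26″W
metres = vincenty_distance(hii[0], hii[1], yqt[0], yqt[1])
print(metres / 1609.344)     # roughly 1599 statute miles
```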
Haversine formula
- 1596.838 miles
- 2569.861 kilometers
- 1387.614 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
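For comparison, here is a minimal haversine sketch assuming the conventional mean Earth radius of 6371 km (the page does not state which radius it uses):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical Earth; returns kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same HII/YQT coordinates as above (decimal degrees)
km = haversine_distance(34.5708, -114.3578, 48.3717, -89.3239)
print(km, km / 1.609344)     # roughly 2570 km / 1597 miles
```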
How long does it take to fly from Lake Havasu City to Thunder Bay?
The estimated flight time from Lake Havasu City Airport to Thunder Bay International Airport is 3 hours and 31 minutes.
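The page does not publish the formula behind this estimate. A common rule of thumb is to divide the great-circle distance by an assumed average cruise speed and add a fixed allowance for taxi, climb, and descent; the speed and allowance in the sketch below are illustrative assumptions, not the site's actual parameters.

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough block-time estimate: cruise leg plus a fixed taxi/climb/descent allowance."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(1599))   # a ballpark figure; the page quotes 3 hours 31 minutes
```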
What is the time difference between Lake Havasu City and Thunder Bay?
Lake Havasu City is on Mountain Standard Time year-round (Arizona does not observe daylight saving time), while Thunder Bay is on Eastern Time, so Thunder Bay is 2 hours ahead of Lake Havasu City in winter and 3 hours ahead while daylight saving time is in effect.
Flight carbon footprint between Lake Havasu City Airport (HII) and Thunder Bay International Airport (YQT)
On average, flying from Lake Havasu City to Thunder Bay generates about 186 kg (410 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
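The pound figure follows from the standard conversion factor of roughly 2.20462 lb per kg:

```python
co2_kg = 186
print(co2_kg * 2.20462)   # about 410 lb
```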
Map of flight path and driving directions from Lake Havasu City to Thunder Bay
See the map of the shortest flight path between Lake Havasu City Airport (HII) and Thunder Bay International Airport (YQT).
Airport information
| Origin | Lake Havasu City Airport |
| --- | --- |
| City: | Lake Havasu City, AZ |
| Country: | United States |
| IATA Code: | HII |
| ICAO Code: | KHII |
| Coordinates: | 34°34′15″N, 114°21′28″W |
| Destination | Thunder Bay International Airport |
| --- | --- |
| City: | Thunder Bay |
| Country: | Canada |
| IATA Code: | YQT |
| ICAO Code: | CYQT |
| Coordinates: | 48°22′18″N, 89°19′26″W |
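The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page expect decimal degrees. A small conversion sketch (the function name and argument layout are illustrative assumptions):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# HII: 34°34′15″N, 114°21′28″W  ->  (34.5708, -114.3578)
print(dms_to_decimal(34, 34, 15, "N"), dms_to_decimal(114, 21, 28, "W"))
# YQT: 48°22′18″N, 89°19′26″W   ->  (48.3717, -89.3239)
print(dms_to_decimal(48, 22, 18, "N"), dms_to_decimal(89, 19, 26, "W"))
```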