
How far is Thunder Bay from Hamburg?

The distance between Hamburg (Hamburg Airport) and Thunder Bay (Thunder Bay International Airport) is 3981 miles / 6407 kilometers / 3460 nautical miles.

Hamburg Airport – Thunder Bay International Airport

3981 miles / 6407 kilometers / 3460 nautical miles

Distance from Hamburg to Thunder Bay

There are several ways to calculate the distance from Hamburg to Thunder Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 3981.285 miles
  • 6407.257 kilometers
  • 3459.642 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
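
For illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid; the function name, unit conversion, and convergence tolerance are choices made for this example rather than the calculator's own code, and its output should match the figures above only to within rounding.

  import math

  def vincenty_miles(lat1, lon1, lat2, lon2):
      # Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles.
      a = 6378137.0                  # semi-major axis in metres
      f = 1 / 298.257223563          # flattening
      b = (1 - f) * a                # semi-minor axis
      L = math.radians(lon2 - lon1)
      U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
      U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
      sinU1, cosU1 = math.sin(U1), math.cos(U1)
      sinU2, cosU2 = math.sin(U2), math.cos(U2)
      lam = L
      for _ in range(200):           # iterate lambda until it converges
          sin_lam, cos_lam = math.sin(lam), math.cos(lam)
          sin_sigma = math.hypot(cosU2 * sin_lam,
                                 cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
          if sin_sigma == 0.0:
              return 0.0             # coincident points
          cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
          sigma = math.atan2(sin_sigma, cos_sigma)
          sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
          cos2_alpha = 1.0 - sin_alpha ** 2
          cos_2sigma_m = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                          if cos2_alpha else 0.0)
          C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
          lam_prev = lam
          lam = L + (1.0 - C) * f * sin_alpha * (
              sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                       (-1.0 + 2.0 * cos_2sigma_m ** 2)))
          if abs(lam - lam_prev) < 1e-12:
              break
      u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
      A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
      B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
      delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
          cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
          B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
          (-3 + 4 * cos_2sigma_m ** 2)))
      return b * A * (sigma - delta_sigma) / 1609.344   # metres -> statute miles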

Haversine formula
  • 3968.973 miles
  • 6387.443 kilometers
  • 3448.944 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
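
A corresponding Python sketch of the haversine formula is below; the 6371 km mean Earth radius is the commonly used value and is an assumption here, since a slightly different radius shifts the result by a few kilometres.

  import math

  def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
      # Great-circle distance on a sphere of the given mean radius; returns statute miles.
      phi1, phi2 = math.radians(lat1), math.radians(lat2)
      dphi = math.radians(lat2 - lat1)
      dlam = math.radians(lon2 - lon1)
      a = (math.sin(dphi / 2) ** 2
           + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
      return 2 * radius_km * math.asin(math.sqrt(a)) / 1.609344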

How long does it take to fly from Hamburg to Thunder Bay?

The estimated flight time from Hamburg Airport to Thunder Bay International Airport is 8 hours and 2 minutes.
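
The calculator does not state how it derives this estimate. A common rough model is block time = distance divided by an assumed average speed, plus a fixed allowance for taxi, climb, and descent; the sketch below uses illustrative values (500 mph and 30 minutes) that are assumptions of this example and will not reproduce the 8 hours 2 minutes figure exactly.

  def estimate_block_time(distance_miles, avg_speed_mph=500.0, allowance_hours=0.5):
      # Rough block-time estimate; both parameters are illustrative assumptions.
      total_minutes = round((distance_miles / avg_speed_mph + allowance_hours) * 60)
      hours, minutes = divmod(total_minutes, 60)
      return f"{hours} h {minutes:02d} min"

  print(estimate_block_time(3981.285))   # result depends entirely on the assumed values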

Flight carbon footprint between Hamburg Airport (HAM) and Thunder Bay International Airport (YQT)

On average, flying from Hamburg to Thunder Bay generates about 454 kg of CO2 per passenger, which is equivalent to roughly 1,001 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
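
For reference, the pound figure is just the standard unit conversion:

  KG_PER_LB = 0.45359237             # exact definition of the avoirdupois pound
  print(round(454 / KG_PER_LB))      # -> 1001 lbs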

Map of flight path from Hamburg to Thunder Bay

See the map of the shortest flight path between Hamburg Airport (HAM) and Thunder Bay International Airport (YQT).

Airport information

Origin: Hamburg Airport
City: Hamburg
Country: Germany
IATA Code: HAM
ICAO Code: EDDH
Coordinates: 53°37′49″N, 9°59′17″E

Destination: Thunder Bay International Airport
City: Thunder Bay
Country: Canada
IATA Code: YQT
ICAO Code: CYQT
Coordinates: 48°22′18″N, 89°19′26″W
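
As a usage example, the coordinates listed above can be converted to decimal degrees and passed to the vincenty_miles and haversine_miles sketches from earlier on this page; dms_to_decimal is a small helper defined only for this example.

  def dms_to_decimal(degrees, minutes, seconds, hemisphere):
      # Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees.
      value = degrees + minutes / 60 + seconds / 3600
      return -value if hemisphere in ("S", "W") else value

  ham = (dms_to_decimal(53, 37, 49, "N"), dms_to_decimal(9, 59, 17, "E"))
  yqt = (dms_to_decimal(48, 22, 18, "N"), dms_to_decimal(89, 19, 26, "W"))

  print(round(vincenty_miles(*ham, *yqt), 3))    # expected close to 3981.285 miles
  print(round(haversine_miles(*ham, *yqt), 3))   # expected close to 3968.973 miles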