
How far is Latrobe, PA, from Thunder Bay?

The distance between Thunder Bay (Thunder Bay International Airport) and Latrobe (Arnold Palmer Regional Airport) is 743 miles / 1196 kilometers / 646 nautical miles.

The driving distance from Thunder Bay (YQT) to Latrobe (LBE) is 1092 miles / 1757 kilometers, and travel time by car is about 21 hours 46 minutes.

Thunder Bay International Airport – Arnold Palmer Regional Airport

  • 743 miles
  • 1196 kilometers
  • 646 nautical miles


Distance from Thunder Bay to Latrobe

There are several ways to calculate the distance from Thunder Bay to Latrobe. Here are two standard methods:

Vincenty's formula (applied above)
  • 743.197 miles
  • 1196.060 kilometers
  • 645.821 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
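For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The iteration cap and convergence tolerance are arbitrary choices, and the decimal-degree coordinates are converted from the DMS values in the airport information below:

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        a = 6378137.0              # WGS-84 semi-major axis (meters)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:   # converged
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1609.344   # meters -> statute miles

    # YQT and LBE in decimal degrees
    print(vincenty_miles(48.3717, -89.3239, 40.2758, -79.4047))  # ≈ 743.2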

Haversine formula
  • 742.601 miles
  • 1195.101 kilometers
  • 645.303 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
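A minimal haversine sketch, assuming a mean Earth radius of 3958.8 statute miles; the spherical-Earth assumption is why this differs slightly from the Vincenty figure above:

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        R = 3958.8  # mean Earth radius in statute miles (assumed value)
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(a))

    print(haversine_miles(48.3717, -89.3239, 40.2758, -79.4047))  # ≈ 742.6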

How long does it take to fly from Thunder Bay to Latrobe?

The estimated flight time from Thunder Bay International Airport to Arnold Palmer Regional Airport is 1 hour and 54 minutes.
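The calculator does not state its flight-time formula. A common heuristic is distance divided by an assumed average speed, plus a fixed allowance for taxi, climb, and descent; the sketch below uses assumed values of 500 mph and 30 minutes, which lands near, but not exactly on, the figure above:

    def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        # Assumed average speed and fixed overhead, not the site's formula
        total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
        hours, minutes = divmod(total_min, 60)
        return f"{hours} h {minutes} min"

    print(estimated_flight_time(743))  # "1 h 59 min" under these assumptions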

What is the time difference between Thunder Bay and Latrobe?

There is no time difference between Thunder Bay and Latrobe.

Flight carbon footprint between Thunder Bay International Airport (YQT) and Arnold Palmer Regional Airport (LBE)

On average, flying from Thunder Bay to Latrobe generates about 129 kg (285 pounds) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
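A sketch of the arithmetic behind these figures. The per-mile emission factor is back-derived from this page's own numbers (129 kg over 743 miles) and is an assumption, not a published methodology; note that 129 kg converts to about 284.4 lbs, so the page's 285 lbs presumably reflects rounding of the underlying kg figure:

    KG_PER_LB = 0.45359237  # exact definition of the pound

    def co2_kg(distance_miles, kg_per_mile=129 / 743):
        # kg_per_mile ≈ 0.174 kg CO2 per passenger-mile (assumed factor)
        return distance_miles * kg_per_mile

    kg = co2_kg(743)
    print(f"{kg:.0f} kg CO2 ≈ {kg / KG_PER_LB:.0f} lbs")  # 129 kg ≈ 284 lbs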

Map of flight path and driving directions from Thunder Bay to Latrobe

See the map of the shortest flight path between Thunder Bay International Airport (YQT) and Arnold Palmer Regional Airport (LBE).

Airport information

Origin: Thunder Bay International Airport
City: Thunder Bay
Country: Canada
IATA Code: YQT
ICAO Code: CYQT
Coordinates: 48°22′18″N, 89°19′26″W
Destination: Arnold Palmer Regional Airport
City: Latrobe, PA
Country: United States
IATA Code: LBE
ICAO Code: KLBE
Coordinates: 40°16′33″N, 79°24′17″W
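The distance formulas above expect decimal degrees; a small sketch converting the DMS coordinates listed here, with south and west taken as negative:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        value = deg + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    yqt = (dms_to_decimal(48, 22, 18, "N"), dms_to_decimal(89, 19, 26, "W"))
    lbe = (dms_to_decimal(40, 16, 33, "N"), dms_to_decimal(79, 24, 17, "W"))
    print(yqt)  # ≈ (48.3717, -89.3239)
    print(lbe)  # ≈ (40.2758, -79.4047)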