How far is Paamiut from Thunder Bay?
The distance between Thunder Bay (Thunder Bay International Airport) and Paamiut (Paamiut Airport) is 1790 miles / 2880 kilometers / 1555 nautical miles.
Thunder Bay International Airport – Paamiut Airport
Distance from Thunder Bay to Paamiut
There are several ways to calculate the distance from Thunder Bay to Paamiut. Here are two standard methods:
Vincenty's formula (applied above)
- 1789.809 miles
- 2880.418 kilometers
- 1555.301 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
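As a rough cross-check, the ellipsoidal result can be reproduced with the geopy library. Its `geodesic` function uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's formula itself, so treat this as a close stand-in and an assumption, not the exact method behind the figures above. The coordinates are taken from the airport table further down the page.

```python
# Sketch: ellipsoidal distance between YQT and JFR using geopy.
# geopy's geodesic() applies Karney's algorithm on WGS-84, used here as a
# stand-in for Vincenty's formula; results agree closely but not exactly.
from geopy.distance import geodesic

yqt = (48.371667, -89.323889)   # Thunder Bay International Airport (48°22′18″N, 89°19′26″W)
jfr = (62.014722, -49.670833)   # Paamiut Airport (62°0′53″N, 49°40′15″W)

d = geodesic(yqt, jfr)
print(f"{d.miles:.3f} miles, {d.km:.3f} km, {d.nautical:.3f} NM")
```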
Haversine formula
- 1784.903 miles
- 2872.523 kilometers
- 1551.039 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the sphere's surface).
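A minimal haversine implementation, assuming the commonly used mean Earth radius of 6371 km (the exact radius behind the figures above is not stated), looks like this:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Thunder Bay (YQT) to Paamiut (JFR); coordinates from the airport table below.
km = haversine_km(48.371667, -89.323889, 62.014722, -49.670833)
print(f"{km:.1f} km = {km * 0.621371:.1f} miles = {km * 0.539957:.1f} NM")
```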
How long does it take to fly from Thunder Bay to Paamiut?
The estimated flight time from Thunder Bay International Airport to Paamiut Airport is 3 hours and 53 minutes.
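The exact assumptions behind this estimate are not stated. A common rule of thumb, used below purely as an assumption, adds about 30 minutes for takeoff and landing to the time spent cruising at roughly 500 mph, which gives a figure in the same range.

```python
# Sketch of a flight-time estimate; the ~500 mph cruise speed and ~30 min
# takeoff/landing allowance are assumptions, not the site's stated method.
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1790))  # roughly 4 h 5 min under these assumptions
```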
What is the time difference between Thunder Bay and Paamiut?
Flight carbon footprint between Thunder Bay International Airport (YQT) and Paamiut Airport (JFR)
On average, flying from Thunder Bay to Paamiut generates about 199 kg (439 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
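The implied emission factor works out to roughly 0.07 kg of CO2 per passenger-kilometre (199 kg over about 2,880 km). A sketch using that factor, which is inferred from the figures above rather than taken from a published methodology, would be:

```python
# Sketch: per-passenger CO2 from distance. The 0.069 kg/km factor is inferred
# from the figures quoted above (199 kg over ~2,880 km), not an official value.
KG_CO2_PER_PASSENGER_KM = 0.069
KG_TO_LB = 2.20462

def co2_per_passenger(distance_km, factor=KG_CO2_PER_PASSENGER_KM):
    kg = distance_km * factor
    return kg, kg * KG_TO_LB

kg, lb = co2_per_passenger(2880)
print(f"~{kg:.0f} kg CO2 (~{lb:.0f} lb) per passenger")
```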
Map of flight path from Thunder Bay to Paamiut
See the map of the shortest flight path between Thunder Bay International Airport (YQT) and Paamiut Airport (JFR).
Airport information
Origin | Thunder Bay International Airport |
---|---|
City: | Thunder Bay |
Country: | Canada |
IATA Code: | YQT |
ICAO Code: | CYQT |
Coordinates: | 48°22′18″N, 89°19′26″W |
Destination | Paamiut Airport |
---|---|
City: | Paamiut |
Country: | Greenland |
IATA Code: | JFR |
ICAO Code: | BGPT |
Coordinates: | 62°0′53″N, 49°40′15″W |