How far is Paamiut from Arctic Bay?
The distance between Arctic Bay (Arctic Bay Airport) and Paamiut (Paamiut Airport) is 1178 miles / 1896 kilometers / 1024 nautical miles.
Arctic Bay Airport – Paamiut Airport
Distance from Arctic Bay to Paamiut
There are several ways to calculate the distance from Arctic Bay to Paamiut. Here are two standard methods:
Vincenty's formula (applied above)
- 1178.264 miles
- 1896.232 kilometers
- 1023.883 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
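The page does not state which implementation it uses, so below is only a minimal sketch of an ellipsoidal distance calculation in Python, using geopy's geodesic distance (Karney's algorithm on the WGS-84 ellipsoid, which solves the same ellipsoidal problem as Vincenty's formula and gives very similar results). The decimal coordinates are converted from the DMS values in the airport tables further down.

```python
# Sketch: ellipsoidal (WGS-84) distance between the two airports with geopy.
from geopy.distance import geodesic

arctic_bay = (73.00556, -85.04250)   # YAB: 73°0′20″N, 85°2′33″W
paamiut = (62.01472, -49.67083)      # JFR: 62°0′53″N, 49°40′15″W

d = geodesic(arctic_bay, paamiut)
print(f"{d.miles:.3f} mi, {d.kilometers:.3f} km, {d.nautical:.3f} nmi")
```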
Haversine formula
- 1174.091 miles
- 1889.516 kilometers
- 1020.257 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth (great-circle distance – the shortest path between two points along the surface of the sphere).
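For comparison, here is a minimal pure-Python sketch of the haversine calculation, assuming a mean Earth radius of 3,958.8 miles (the exact radius behind the figures above is not stated, so the result may differ slightly).

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance on a sphere of the given radius (default: mean Earth radius in miles)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_mi * asin(sqrt(a))

# Arctic Bay (YAB) to Paamiut (JFR), decimal degrees
print(round(haversine_miles(73.00556, -85.04250, 62.01472, -49.67083), 3))
```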
How long does it take to fly from Arctic Bay to Paamiut?
The estimated flight time from Arctic Bay Airport to Paamiut Airport is 2 hours and 43 minutes.
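Such estimates are typically derived from the great-circle distance, an assumed average cruise speed, and a fixed allowance for taxi, climb and descent. The sketch below uses illustrative values only (500 mph and a 30-minute allowance), which are not necessarily the parameters behind the figure above, so its output differs somewhat.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # Assumed values: ~500 mph average speed plus a fixed taxi/climb/descent
    # allowance. Real estimates vary with aircraft type and winds.
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1178.264))  # about 2 h 51 min with these assumptions
```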
What is the time difference between Arctic Bay and Paamiut?
Flight carbon footprint between Arctic Bay Airport (YAB) and Paamiut Airport (JFR)
On average, flying from Arctic Bay to Paamiut generates about 161 kg of CO2 per passenger; 161 kilograms is equal to about 355 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
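As a rough sketch of the arithmetic behind such a figure: a per-passenger emission factor is scaled by the flight distance, then converted between units. The 85 g CO2 per kilometre used below is an illustrative assumption, not the exact factor behind the estimate above.

```python
def co2_estimate_kg(distance_km, grams_per_km=85.0):
    # grams_per_km is an assumed, illustrative per-passenger emission factor.
    return distance_km * grams_per_km / 1000.0

kg = co2_estimate_kg(1896)
print(f"{kg:.0f} kg CO2 ≈ {kg * 2.20462:.0f} lbs")  # ~161 kg ≈ ~355 lbs
```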
Map of flight path from Arctic Bay to Paamiut
See the map of the shortest flight path between Arctic Bay Airport (YAB) and Paamiut Airport (JFR).
Airport information
| Origin | Arctic Bay Airport |
| --- | --- |
| City: | Arctic Bay |
| Country: | Canada |
| IATA Code: | YAB |
| ICAO Code: | CYAB |
| Coordinates: | 73°0′20″N, 85°2′33″W |
| Destination | Paamiut Airport |
| --- | --- |
| City: | Paamiut |
| Country: | Greenland |
| IATA Code: | JFR |
| ICAO Code: | BGPT |
| Coordinates: | 62°0′53″N, 49°40′15″W |
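The coordinates above are given in degrees, minutes and seconds, while the distance sketches earlier assume decimal degrees. A small helper for the conversion (the function name is just for illustration):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # South latitudes and west longitudes are negative.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Arctic Bay Airport (YAB): 73°0′20″N, 85°2′33″W
print(dms_to_decimal(73, 0, 20, "N"), dms_to_decimal(85, 2, 33, "W"))
# Paamiut Airport (JFR): 62°0′53″N, 49°40′15″W
print(dms_to_decimal(62, 0, 53, "N"), dms_to_decimal(49, 40, 15, "W"))
```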