
How far is Kangiqsujuaq from Paamiut?

The distance between Paamiut (Paamiut Airport) and Kangiqsujuaq (Kangiqsujuaq (Wakeham Bay) Airport) is 726 miles / 1169 kilometers / 631 nautical miles.

Paamiut Airport – Kangiqsujuaq (Wakeham Bay) Airport

726 miles / 1169 kilometers / 631 nautical miles


Distance from Paamiut to Kangiqsujuaq

There are several ways to calculate the distance from Paamiut to Kangiqsujuaq. Here are two standard methods:

Vincenty's formula (applied above)
  • 726.425 miles
  • 1169.068 kilometers
  • 631.246 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
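
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, fed with the airport coordinates listed at the bottom of this page converted to decimal degrees. The iteration limit and convergence tolerance are arbitrary choices, and rounding of the inputs means the result may differ from 726.425 miles in the last digits.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    # WGS-84 ellipsoid (the usual choice for Vincenty's inverse formula)
    a = 6378137.0              # semi-major axis, metres
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis, metres

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break              # converged

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2)
        - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344   # metres -> statute miles

# Paamiut (JFR) and Kangiqsujuaq (YWB) in decimal degrees (see coordinates below)
print(round(vincenty_miles(62.0147, -49.6708, 61.5883, -71.9292), 3))  # ~726.4
```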

Haversine formula
  • 723.727 miles
  • 1164.725 kilometers
  • 628.901 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the earth's surface).
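
A minimal Python sketch of the haversine formula is below, using the same decimal-degree coordinates as above. The 3958.8-mile value is the commonly quoted mean earth radius; a different radius choice shifts the result slightly, so it may not match the quoted 723.727 miles exactly.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    R = 3958.8   # mean earth radius in statute miles (spherical model)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Paamiut (JFR) -> Kangiqsujuaq (YWB)
print(round(haversine_miles(62.0147, -49.6708, 61.5883, -71.9292), 3))  # ~723.7
```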

How long does it take to fly from Paamiut to Kangiqsujuaq?

The estimated flight time from Paamiut Airport to Kangiqsujuaq (Wakeham Bay) Airport is 1 hour and 52 minutes.
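
For a rough sense of where such an estimate comes from: divide the distance by an assumed cruise speed and add a fixed allowance for taxi, climb and descent. The 530 mph cruise speed and 30-minute allowance below are illustrative assumptions that happen to land close to the quoted figure; the calculator's actual parameters are not published here.

```python
def estimated_flight_time(distance_miles, cruise_mph=530, overhead_min=30):
    """Block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.
    Both parameters are assumptions, not the calculator's published values."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hour(s) {minutes} minute(s)"

print(estimated_flight_time(726.425))  # -> 1 hour(s) 52 minute(s) with these assumptions
```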

Flight carbon footprint between Paamiut Airport (JFR) and Kangiqsujuaq (Wakeham Bay) Airport (YWB)

On average, flying from Paamiut to Kangiqsujuaq generates about 127 kg of CO2 per passenger, which is equivalent to roughly 281 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
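
A rough way to reproduce numbers of this kind is to multiply the flight distance by a per-passenger emission factor and convert kilograms to pounds. The 0.109 kg CO2 per passenger-kilometre used below is an illustrative assumption, not the calculator's published factor.

```python
KG_TO_LB = 2.20462   # pounds per kilogram

def co2_per_passenger_kg(distance_km, factor_kg_per_km=0.109):
    """Per-passenger CO2 estimate; the emission factor is an illustrative
    assumption, not the calculator's published figure."""
    return distance_km * factor_kg_per_km

kg = co2_per_passenger_kg(1169.068)
print(f"{kg:.0f} kg CO2 ≈ {kg * KG_TO_LB:.0f} lb")  # ~127 kg ≈ ~281 lb
```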

Map of flight path from Paamiut to Kangiqsujuaq

See the map of the shortest flight path between Paamiut Airport (JFR) and Kangiqsujuaq (Wakeham Bay) Airport (YWB).

Airport information

Origin Paamiut Airport
City: Paamiut
Country: Greenland
IATA Code: JFR
ICAO Code: BGPT
Coordinates: 62°0′53″N, 49°40′15″W
Destination Kangiqsujuaq (Wakeham Bay) Airport
City: Kangiqsujuaq
Country: Canada
IATA Code: YWB
ICAO Code: CYKG
Coordinates: 61°35′18″N, 71°55′45″W
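
To plug the coordinates above into the distance formulas earlier on this page, they first need to be converted from degrees/minutes/seconds to decimal degrees. A small helper, assuming the usual sign convention (south and west negative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Paamiut Airport (JFR): 62°0′53″N, 49°40′15″W
jfr = (dms_to_decimal(62, 0, 53, "N"), dms_to_decimal(49, 40, 15, "W"))
# Kangiqsujuaq (Wakeham Bay) Airport (YWB): 61°35′18″N, 71°55′45″W
ywb = (dms_to_decimal(61, 35, 18, "N"), dms_to_decimal(71, 55, 45, "W"))
print(jfr, ywb)  # ≈ (62.0147, -49.6708) (61.5883, -71.9292)
```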