
How far is Aasiaat from Kangiqsujuaq?

The distance between Kangiqsujuaq (Kangiqsujuaq (Wakeham Bay) Airport) and Aasiaat (Aasiaat Airport) is 740 miles / 1190 kilometers / 643 nautical miles.

Kangiqsujuaq (Wakeham Bay) Airport – Aasiaat Airport

  • 740 miles
  • 1190 kilometers
  • 643 nautical miles

Distance from Kangiqsujuaq to Aasiaat

There are several ways to calculate the distance from Kangiqsujuaq to Aasiaat. Here are two standard methods:

Vincenty's formula (applied above)
  • 739.680 miles
  • 1190.399 kilometers
  • 642.764 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
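For illustration, the sketch below implements Vincenty's inverse formula in Python on the WGS-84 ellipsoid. The ellipsoid parameters and convergence tolerance are standard textbook values rather than figures taken from this page, so the result may differ slightly from the distance quoted above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Ellipsoidal distance in metres (Vincenty's inverse formula, WGS-84)."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)                         # metres
```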

Haversine formula
  • 737.212 miles
  • 1186.428 kilometers
  • 640.620 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
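A minimal Python sketch of the haversine formula, assuming a mean Earth radius of 6371 km (the spherical approximation behind the slightly smaller figures above):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))
```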

How long does it take to fly from Kangiqsujuaq to Aasiaat?

The estimated flight time from Kangiqsujuaq (Wakeham Bay) Airport to Aasiaat Airport is 1 hour and 54 minutes.
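The page does not state how this estimate is derived. A common back-of-envelope model is cruise time at an assumed average speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses hypothetical parameters (500 mph cruise, 30 minutes overhead), so it will not reproduce the 1 hour 54 minute figure exactly.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Hypothetical block-time model: cruise segment plus fixed overhead (assumed values)."""
    total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
    hours, minutes = divmod(total_min, 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(739.680))   # ≈ 1 h 59 min with these assumed parameters
```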

Flight carbon footprint between Kangiqsujuaq (Wakeham Bay) Airport (YWB) and Aasiaat Airport (JEG)

On average, flying from Kangiqsujuaq to Aasiaat generates about 129 kg (284 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
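As a back-of-envelope check derived only from the figures quoted above (the site's underlying emission model is not published here):

```python
KG_PER_LB = 0.453592

co2_kg = 129                         # per-passenger estimate quoted above
co2_lb = co2_kg / KG_PER_LB          # ≈ 284 lb
intensity = co2_kg / 1190.399        # ≈ 0.108 kg CO2 per passenger-kilometre

print(f"{co2_lb:.0f} lb, {intensity:.3f} kg CO2 per passenger-km")
```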

Map of flight path from Kangiqsujuaq to Aasiaat

See the map of the shortest flight path between Kangiqsujuaq (Wakeham Bay) Airport (YWB) and Aasiaat Airport (JEG).

Airport information

Origin: Kangiqsujuaq (Wakeham Bay) Airport
City: Kangiqsujuaq
Country: Canada
IATA Code: YWB
ICAO Code: CYKG
Coordinates: 61°35′18″N, 71°55′45″W
Destination: Aasiaat Airport
City: Aasiaat
Country: Greenland
IATA Code: JEG
ICAO Code: BGAA
Coordinates: 68°43′18″N, 52°47′4″W
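Using the coordinates above, a quick end-to-end check can tie the two distance formulas back to the figures on this page. The snippet below assumes the vincenty_distance and haversine_distance sketches defined earlier in this section are in scope; the dms_to_decimal helper is illustrative, not part of the calculator.

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# YWB: 61°35′18″N, 71°55′45″W    JEG: 68°43′18″N, 52°47′4″W
ywb = (dms_to_decimal(61, 35, 18, "N"), dms_to_decimal(71, 55, 45, "W"))
jeg = (dms_to_decimal(68, 43, 18, "N"), dms_to_decimal(52, 47, 4, "W"))

print(vincenty_distance(*ywb, *jeg) / 1000)   # ≈ 1190 km (ellipsoidal)
print(haversine_distance(*ywb, *jeg))         # ≈ 1186 km (spherical)
```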