
How far is Kuujjuarapik from Churchill?

The distance between Churchill (Churchill Airport) and Kuujjuarapik (Kuujjuarapik Airport) is 658 miles / 1059 kilometers / 572 nautical miles.

The driving distance from Churchill (YYQ) to Kuujjuarapik (YGW) is 2208 miles / 3553 kilometers, and travel time by car is about 51 hours 51 minutes.

Churchill Airport – Kuujjuarapik Airport

  • 658 miles
  • 1059 kilometers
  • 572 nautical miles


Distance from Churchill to Kuujjuarapik

There are several ways to calculate the distance from Churchill to Kuujjuarapik. Here are two standard methods:

Vincenty's formula (applied above)
  • 658.135 miles
  • 1059.165 kilometers
  • 571.904 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
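For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the decimal form of the airport coordinates listed further down the page; the sketch omits the edge cases (coincident and near-antipodal points) that a production implementation must handle, and the calculator's exact code may differ.

    from math import atan, atan2, cos, radians, sin, sqrt, tan

    def vincenty_miles(lat1, lon1, lat2, lon2):
        """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
        a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
        b = a * (1 - f)

        U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
        U2 = atan((1 - f) * tan(radians(lat2)))
        L = radians(lon2 - lon1)
        sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

        lam = L
        for _ in range(200):                  # iterate until lambda converges
            sin_lam, cos_lam = sin(lam), cos(lam)
            sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (2 * cos_2sm ** 2 - 1) -
            B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
        return b * A * (sigma - d_sigma) / 1609.344  # metres -> statute miles

    # YYQ and YGW coordinates from the airport information below
    print(round(vincenty_miles(58.7392, -94.0650, 55.2817, -77.7653), 3))  # ≈ 658.1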

Haversine formula
  • 656.024 miles
  • 1055.768 kilometers
  • 570.069 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
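A matching Python sketch of the haversine formula. The mean Earth radius of 3,958.8 miles is an assumed value; using a slightly different radius explains small discrepancies in the last decimals.

    from math import asin, cos, radians, sin, sqrt

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance on a sphere of mean Earth radius; returns miles."""
        R = 3958.8  # mean Earth radius in statute miles (assumed value)
        p1, p2 = radians(lat1), radians(lat2)
        dlat = radians(lat2 - lat1)
        dlon = radians(lon2 - lon1)
        a = sin(dlat / 2) ** 2 + cos(p1) * cos(p2) * sin(dlon / 2) ** 2
        return 2 * R * asin(sqrt(a))

    print(round(haversine_miles(58.7392, -94.0650, 55.2817, -77.7653), 3))  # ≈ 656.0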

How long does it take to fly from Churchill to Kuujjuarapik?

The estimated flight time from Churchill Airport to Kuujjuarapik Airport is 1 hour and 44 minutes.
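A common rule of thumb reproduces this kind of estimate: cruise time at a typical regional-airliner speed plus a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute allowance below are illustrative assumptions, not the calculator's actual model, which is why the result lands a few minutes off the quoted figure.

    # Rough flight-time estimate (assumed speed and overhead, not the site's model)
    distance_miles = 658.135
    cruise_mph = 500          # assumed average cruise speed
    overhead_min = 30         # assumed taxi/climb/descent allowance

    total_min = distance_miles / cruise_mph * 60 + overhead_min
    print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # 1 h 48 min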

Flight carbon footprint between Churchill Airport (YYQ) and Kuujjuarapik Airport (YGW)

On average, flying from Churchill to Kuujjuarapik generates about 120 kg (264 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
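The kilogram-to-pound conversion can be checked directly (1 kg ≈ 2.20462 lb):

    co2_kg = 120
    co2_lb = co2_kg * 2.20462   # 1 kg ≈ 2.20462 lb
    print(round(co2_lb, 1))     # 264.6 — the page's 264 is consistent with a simpler 2.2 lb/kg factor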

Map of flight path and driving directions from Churchill to Kuujjuarapik

See the map of the shortest flight path between Churchill Airport (YYQ) and Kuujjuarapik Airport (YGW).

Airport information

Origin Churchill Airport
City: Churchill
Country: Canada
IATA Code: YYQ
ICAO Code: CYYQ
Coordinates: 58°44′21″N, 94°3′54″W
Destination Kuujjuarapik Airport
City: Kuujjuarapik
Country: Canada
IATA Code: YGW
ICAO Code: CYGW
Coordinates: 55°16′54″N, 77°45′55″W
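The coordinates above are listed in degrees/minutes/seconds, while the distance sketches earlier use decimal degrees. A small helper for the conversion (the function name is illustrative):

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    # Coordinates as listed above
    yyq = (dms_to_decimal(58, 44, 21, "N"), dms_to_decimal(94, 3, 54, "W"))
    ygw = (dms_to_decimal(55, 16, 54, "N"), dms_to_decimal(77, 45, 55, "W"))
    print(yyq)  # (58.7392..., -94.065)
    print(ygw)  # (55.2816..., -77.7652...)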