How far is Arctic Bay from Quebec?
The distance between Quebec (Québec City Jean Lesage International Airport) and Arctic Bay (Arctic Bay Airport) is 1865 miles / 3001 kilometers / 1621 nautical miles.
Québec City Jean Lesage International Airport – Arctic Bay Airport
Distance from Quebec to Arctic Bay
There are several ways to calculate the distance from Quebec to Arctic Bay. Here are two standard methods:
Vincenty's formula (applied above)
- 1864.879 miles
- 3001.232 kilometers
- 1620.536 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
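For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is an illustration, not the calculator this site uses, and the iteration can fail to converge for nearly antipodal points.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two points (Vincenty inverse)."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                       # iterate until lambda converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0                         # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)   # equatorial-line special case
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamOld = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamOld) < 1e-12:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)

# YQB and YAB coordinates from the airport tables below, in decimal degrees
print(vincenty_distance_m(46.7908, -71.3931, 73.0056, -85.0425) / 1000)  # ~3001 km
```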
Haversine formula
- 1861.267 miles
- 2995.419 kilometers
- 1617.397 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
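A haversine implementation is much shorter; the roughly 6 km gap versus Vincenty above comes entirely from treating the earth as a sphere. This sketch assumes the conventional 6371 km mean earth radius.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_km(46.7908, -71.3931, 73.0056, -85.0425))  # ~2995 km
```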
How long does it take to fly from Quebec to Arctic Bay?
The estimated flight time from Québec City Jean Lesage International Airport to Arctic Bay Airport is 4 hours and 1 minute.
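The site does not publish its timing model, but estimates like this are typically block time: great-circle distance divided by an assumed cruise speed, plus a fixed allowance for taxi, climb, and descent. The sketch below uses a hypothetical 500 mph cruise speed and 30-minute allowance, so it only approximates the quoted figure.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.

    cruise_mph and overhead_min are assumed values, not this site's
    published parameters, so the result only approximates the quoted
    4 hours and 1 minute."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimated_flight_time(1864.879))  # roughly 4 hours under these assumptions
```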
What is the time difference between Quebec and Arctic Bay?
The time difference between Quebec and Arctic Bay is 1 hour: Arctic Bay is 1 hour behind Quebec.
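The offset can be checked programmatically. Here is a minimal sketch using Python's zoneinfo; the IANA zone names are my assumptions, not given in the article (Quebec City observes Eastern Time, and Arctic Bay, Nunavut is generally listed under Central Time).

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Zone names assumed, not taken from the article.
when = datetime(2024, 1, 15, 12, 0, tzinfo=timezone.utc)
quebec = when.astimezone(ZoneInfo("America/Toronto"))           # Eastern Time
arctic_bay = when.astimezone(ZoneInfo("America/Rankin_Inlet"))  # Central Time

print(quebec.utcoffset() - arctic_bay.utcoffset())  # 1:00:00 -> 1 hour behind
```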
Flight carbon footprint between Québec City Jean Lesage International Airport (YQB) and Arctic Bay Airport (YAB)
On average, flying from Quebec to Arctic Bay generates about 205 kg (453 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
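The unit conversion itself is simple (1 kg ≈ 2.20462 lb). Note that converting the already-rounded 205 kg gives 452 lb, so the quoted 453 lb presumably comes from the unrounded estimate.

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 205  # the article's rounded per-passenger estimate
print(round(co2_kg * KG_TO_LB))  # 452; the quoted 453 lb likely reflects
                                 # the unrounded kilogram figure
```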
Map of flight path from Quebec to Arctic Bay
See the map of the shortest flight path between Québec City Jean Lesage International Airport (YQB) and Arctic Bay Airport (YAB).
Airport information
| Origin | Québec City Jean Lesage International Airport |
|---|---|
| City | Quebec |
| Country | Canada |
| IATA Code | YQB |
| ICAO Code | CYQB |
| Coordinates | 46°47′27″N, 71°23′35″W |
| Destination | Arctic Bay Airport |
|---|---|
| City | Arctic Bay |
| Country | Canada |
| IATA Code | YAB |
| ICAO Code | CYAB |
| Coordinates | 73°0′20″N, 85°2′33″W |
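The coordinates above are what feed the distance formulas earlier on the page. Here is a small sketch for converting the degree/minute/second notation to the decimal degrees those functions expect; the parser assumes exactly this °/′/″ formatting.

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert e.g. 46°47′27″N to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.fullmatch(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("46°47′27″N"), dms_to_decimal("71°23′35″W"))  # YQB
print(dms_to_decimal("73°0′20″N"), dms_to_decimal("85°2′33″W"))    # YAB
```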