How far is Quebec from Arctic Bay?
The distance between Arctic Bay (Arctic Bay Airport) and Quebec (Québec City Jean Lesage International Airport) is 1865 miles / 3001 kilometers / 1621 nautical miles.
Arctic Bay Airport – Québec City Jean Lesage International Airport
Distance from Arctic Bay to Quebec
There are several ways to calculate the distance from Arctic Bay to Quebec. Here are two standard methods:
Vincenty's formula (applied above)
- 1864.879 miles
- 3001.232 kilometers
- 1620.536 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
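A minimal sketch of the ellipsoidal calculation using the third-party geopy library. Note that geopy's `geodesic()` uses Karney's algorithm on the WGS-84 ellipsoid, a modern successor to Vincenty's iterative formula that agrees with it extremely closely, so the result should approximately match the figure above:

```python
# Ellipsoidal (WGS-84) distance via geopy's geodesic(), which uses
# Karney's algorithm, a successor to Vincenty's formula.
from geopy.distance import geodesic

yab = (73.0056, -85.0425)   # Arctic Bay Airport, decimal degrees
yqb = (46.7908, -71.3931)   # Québec City Jean Lesage Intl

d = geodesic(yab, yqb)
print(f"{d.miles:.3f} mi, {d.km:.3f} km, {d.nautical:.3f} NM")
# ≈ 1864.9 mi / 3001.2 km / 1620.5 NM
```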
Haversine formula
- 1861.267 miles
- 2995.419 kilometers
- 1617.397 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
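The haversine formula is simple enough to implement directly. A self-contained sketch assuming a spherical earth with radius 6371 km (the conventional mean radius), which reproduces the figures above:

```python
# Great-circle (haversine) distance on a spherical earth, R ≈ 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance between two lat/lon points, in kilometers."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * r * asin(sqrt(a))

d_km = haversine_km(73.0056, -85.0425, 46.7908, -71.3931)
print(f"{d_km:.3f} km, {d_km * 0.621371:.3f} mi, {d_km * 0.539957:.3f} NM")
# ≈ 2995.4 km / 1861.3 mi / 1617.4 NM
```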
How long does it take to fly from Arctic Bay to Quebec?
The estimated flight time from Arctic Bay Airport to Québec City Jean Lesage International Airport is 4 hours and 1 minute.
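The speed assumption behind this estimate isn't stated. A common rule of thumb for such figures is an average speed near 500 mph plus about 30 minutes for takeoff and landing; the sketch below uses those assumed constants, so it only approximates the figure above:

```python
# Rough flight-time estimate: distance / assumed average speed, plus a
# fixed allowance for takeoff and landing. Both constants are assumptions;
# the page's exact methodology is not published here.
distance_mi = 1865
cruise_mph = 500          # assumed average ground speed
overhead_min = 30         # assumed taxi/climb/descent allowance

total_min = distance_mi / cruise_mph * 60 + overhead_min
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # ≈ 4 h 13 min
```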
What is the time difference between Arctic Bay and Quebec?
The time difference between Arctic Bay and Quebec is 1 hour. Quebec is 1 hour ahead of Arctic Bay.
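This can be checked with Python's standard zoneinfo module. The sketch assumes Arctic Bay observes Central Time (the America/Rankin_Inlet zone is used here as an assumption) and Quebec City observes Eastern Time:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Assumed IANA zones: Arctic Bay, NU on Central Time (assumption),
# Quebec City on Eastern Time.
arctic_bay = ZoneInfo("America/Rankin_Inlet")
quebec_city = ZoneInfo("America/Toronto")

now = datetime.now(timezone.utc)
diff = (now.astimezone(quebec_city).utcoffset()
        - now.astimezone(arctic_bay).utcoffset())
print(diff)  # 1:00:00 -> Quebec is 1 hour ahead
```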
Flight carbon footprint between Arctic Bay Airport (YAB) and Québec City Jean Lesage International Airport (YQB)
On average, flying from Arctic Bay to Quebec generates about 205 kg (453 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
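A back-of-envelope check of these figures; the site's actual emissions model is not published here, so this only verifies the units and the implied per-kilometer rate:

```python
# Sanity-check the stated per-passenger CO2 figure.
co2_kg = 205          # per passenger, jet-fuel combustion only
distance_km = 3001

print(f"{co2_kg / distance_km:.3f} kg CO2 per passenger-km")  # ~0.068
print(f"{co2_kg * 2.20462:.0f} lb")  # ~452 lb (the page's 453 suggests
                                     # it converts an unrounded kg value)
```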
Map of flight path from Arctic Bay to Quebec
See the map of the shortest flight path between Arctic Bay Airport (YAB) and Québec City Jean Lesage International Airport (YQB).
Airport information
| Origin | Arctic Bay Airport |
| --- | --- |
| City | Arctic Bay |
| Country | Canada |
| IATA Code | YAB |
| ICAO Code | CYAB |
| Coordinates | 73°0′20″N, 85°2′33″W |
| Destination | Québec City Jean Lesage International Airport |
| --- | --- |
| City | Quebec |
| Country | Canada |
| IATA Code | YQB |
| ICAO Code | CYQB |
| Coordinates | 46°47′27″N, 71°23′35″W |
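To use the tabulated coordinates with the distance formulas above, they must be converted from degrees/minutes/seconds to decimal degrees. A minimal sketch (the `dms_to_decimal` helper is illustrative, not from any particular library):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 73°0′20″N to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

# Airport coordinates from the tables above
yab = (dms_to_decimal("73°0′20″N"), dms_to_decimal("85°2′33″W"))
yqb = (dms_to_decimal("46°47′27″N"), dms_to_decimal("71°23′35″W"))
print(yab, yqb)  # ≈ (73.0056, -85.0425) (46.7908, -71.3931)
```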