How far is Whale Cove from Arctic Bay?
The distance between Arctic Bay (Arctic Bay Airport) and Whale Cove (Whale Cove Airport) is 771 miles / 1241 kilometers / 670 nautical miles.
Arctic Bay Airport – Whale Cove Airport
Distance from Arctic Bay to Whale Cove
There are several ways to calculate the distance from Arctic Bay to Whale Cove. Here are two standard methods:
Vincenty's formula (applied above)
- 770.839 miles
- 1240.545 kilometers
- 669.841 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
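For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid (semi-major axis 6 378 137 m, flattening 1/298.257223563). The airport coordinates are taken from the tables below; the iteration cap and convergence tolerance are arbitrary choices for this sketch, not part of any published implementation.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: distance in meters between two points on the
    WGS-84 ellipsoid. Returns None if the iteration fails to converge."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # arbitrary iteration cap
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha != 0 else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:  # arbitrary tolerance
            break
    else:
        return None             # no convergence (near-antipodal points)

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YAB (73°0′20″N, 85°2′33″W) to YXN (62°14′24″N, 92°35′53″W)
yab = (73 + 20 / 3600, -(85 + 2 / 60 + 33 / 3600))
yxn = (62 + 14 / 60 + 24 / 3600, -(92 + 35 / 60 + 53 / 3600))
m = vincenty_distance(yab[0], yab[1], yxn[0], yxn[1])
# Should reproduce roughly 770.839 mi / 1240.545 km / 669.841 nmi
print(f"{m / 1609.344:.3f} mi, {m / 1000:.3f} km, {m / 1852:.3f} nmi")
```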
Haversine formula
- 768.489 miles
- 1236.764 kilometers
- 667.799 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
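The spherical calculation is much shorter. A minimal Python sketch follows; the 6 371 km mean earth radius is a common convention, and small changes to it shift the result slightly.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Decimal degrees converted from the DMS coordinates in the tables below
km = haversine_distance(73.00556, -85.0425, 62.24, -92.59806)
# Should reproduce roughly 1236.764 km / 768.489 mi / 667.799 nmi
print(f"{km:.3f} km = {km / 1.609344:.3f} mi = {km / 1.852:.3f} nmi")
```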
How long does it take to fly from Arctic Bay to Whale Cove?
The estimated flight time from Arctic Bay Airport to Whale Cove Airport is 1 hour and 57 minutes.
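The page does not state how this estimate is derived. A common rule of thumb is cruise distance at an assumed average speed plus a fixed allowance for taxi, takeoff, and landing; the sketch below uses 500 mph and 30 minutes, both assumptions, and lands a few minutes over the quoted figure.

```python
def flight_time_minutes(distance_miles, avg_speed_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus a fixed overhead.
    Both parameters are assumptions, not published values."""
    return distance_miles / avg_speed_mph * 60 + overhead_min

total = flight_time_minutes(770.839)
print(f"{int(total // 60)} h {round(total % 60)} min")  # ~2 h 3 min with these assumptions
```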
What is the time difference between Arctic Bay and Whale Cove?
There is no time difference between Arctic Bay and Whale Cove.
Flight carbon footprint between Arctic Bay Airport (YAB) and Whale Cove Airport (YXN)
On average, flying from Arctic Bay to Whale Cove generates about 132 kg of CO2 per passenger, which is equivalent to 291 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
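The pound figure follows directly from the kilogram figure; a one-line check:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 132
print(f"{co2_kg} kg = {co2_kg * KG_TO_LB:.0f} lb")  # 132 kg = 291 lb
```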
Map of flight path from Arctic Bay to Whale Cove
See the map of the shortest flight path between Arctic Bay Airport (YAB) and Whale Cove Airport (YXN).
Airport information
| Origin | Arctic Bay Airport |
| --- | --- |
| City: | Arctic Bay |
| Country: | Canada |
| IATA Code: | YAB |
| ICAO Code: | CYAB |
| Coordinates: | 73°0′20″N, 85°2′33″W |
| Destination | Whale Cove Airport |
| --- | --- |
| City: | Whale Cove |
| Country: | Canada |
| IATA Code: | YXN |
| ICAO Code: | CYXN |
| Coordinates: | 62°14′24″N, 92°35′53″W |