How far is Arctic Bay from Whale Cove?
The distance between Whale Cove (Whale Cove Airport) and Arctic Bay (Arctic Bay Airport) is 771 miles / 1241 kilometers / 670 nautical miles.
Whale Cove Airport – Arctic Bay Airport
Distance from Whale Cove to Arctic Bay
There are several ways to calculate the distance from Whale Cove to Arctic Bay. Here are two standard methods:
Vincenty's formula (applied above)
- 770.839 miles
- 1240.545 kilometers
- 669.841 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
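As a sketch of how such a calculation works, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport table below; the function name, iteration limit, and convergence tolerance are our own choices, and the result should match the figures above to within rounding.

```python
import math

# WGS-84 ellipsoid parameters
A = 6378137.0            # semi-major axis (metres)
F = 1 / 298.257223563    # flattening
B = (1 - F) * A          # semi-minor axis (metres)

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: distance in metres between two points on the WGS-84 ellipsoid."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # iterate on the longitude difference on the auxiliary sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha == 0 only for equatorial geodesics
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    A_ = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B_ = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B_ * sin_sigma * (
        cos_2sm + B_ / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B_ / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                                * (-3 + 4 * cos_2sm ** 2)))
    return B * A_ * (sigma - delta_sigma)

# Airport coordinates in decimal degrees (from the DMS values in the table below)
yxn = (62.2400, -92.5981)   # Whale Cove (YXN)
yab = (73.0056, -85.0425)   # Arctic Bay (YAB)
metres = vincenty_distance(*yxn, *yab)
print(f"{metres / 1609.344:.3f} mi, {metres / 1000:.3f} km, {metres / 1852:.3f} NM")
```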
Haversine formula
- 768.489 miles
- 1236.764 kilometers
- 667.799 nautical miles
The haversine formula calculates the great-circle distance between latitude/longitude points on a spherical earth, which is the shortest distance between the two points along the surface.
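The haversine formula is compact enough to show in full. A minimal Python sketch follows, using the same decimal coordinates as above; the exact result depends slightly on which mean earth radius is assumed (6371 km here), which is why it differs a little from the ellipsoidal figure.

```python
import math

EARTH_RADIUS_KM = 6371.0  # assumed mean earth radius for the spherical model

def haversine_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

km = haversine_distance(62.2400, -92.5981, 73.0056, -85.0425)
print(f"{km / 1.609344:.3f} mi, {km:.3f} km, {km / 1.852:.3f} NM")
```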
How long does it take to fly from Whale Cove to Arctic Bay?
The estimated flight time from Whale Cove Airport to Arctic Bay Airport is 1 hour and 57 minutes.
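Estimates like this typically combine an assumed average cruise speed with a fixed allowance for taxi, climb, and descent. The constants behind the figure above are not stated, so the values in this sketch are assumptions chosen to land close to it.

```python
CRUISE_MPH = 500     # assumed average cruise speed, not stated by the source
OVERHEAD_MIN = 25    # assumed allowance for taxi, climb, and descent

def estimate_flight_minutes(distance_miles, cruise_mph=CRUISE_MPH, overhead_min=OVERHEAD_MIN):
    """Rough block-time estimate: cruise time plus a fixed overhead."""
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = estimate_flight_minutes(770.839)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ~1 h 58 min with these constants
```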
What is the time difference between Whale Cove and Arctic Bay?
There is no time difference between Whale Cove and Arctic Bay.
Flight carbon footprint between Whale Cove Airport (YXN) and Arctic Bay Airport (YAB)
On average, flying from Whale Cove to Arctic Bay generates about 132 kg (291 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
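The kilogram-to-pound conversion behind that figure is easy to verify; the emissions estimate itself comes from the source's own unspecified model.

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 132
co2_lb = co2_kg / KG_PER_LB
print(f"{co2_kg} kg = {co2_lb:.0f} lb")  # 132 kg = 291 lb
```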
Map of flight path from Whale Cove to Arctic Bay
See the map of the shortest flight path between Whale Cove Airport (YXN) and Arctic Bay Airport (YAB).
Airport information
| Origin | Whale Cove Airport |
| --- | --- |
| City | Whale Cove |
| Country | Canada |
| IATA Code | YXN |
| ICAO Code | CYXN |
| Coordinates | 62°14′24″N, 92°35′53″W |
| Destination | Arctic Bay Airport |
| --- | --- |
| City | Arctic Bay |
| Country | Canada |
| IATA Code | YAB |
| ICAO Code | CYAB |
| Coordinates | 73°0′20″N, 85°2′33″W |
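The tables list coordinates in degrees, minutes, and seconds, while the distance formulas above expect decimal degrees. A small helper can do the conversion; the function name and the regular expression are our own, and it assumes the exact DMS notation used in these tables.

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 62°14′24″N to signed decimal degrees."""
    m = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms.strip())
    deg, minutes, seconds, hemi = int(m[1]), int(m[2]), int(m[3]), m[4]
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemi in "SW" else value  # south and west are negative

print(dms_to_decimal("62°14′24″N"), dms_to_decimal("92°35′53″W"))  # 62.24 -92.5981...
```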