How far is Iqaluit from Whale Cove?
The distance between Whale Cove (Whale Cove Airport) and Iqaluit (Iqaluit Airport) is 760 miles / 1223 kilometers / 660 nautical miles.
The driving distance from Whale Cove (YXN) to Iqaluit (YFB) is 3764 miles / 6058 kilometers, and travel time by car is about 108 hours 17 minutes.
Whale Cove Airport – Iqaluit Airport
Distance from Whale Cove to Iqaluit
There are several ways to calculate the distance from Whale Cove to Iqaluit. Here are two standard methods:
Vincenty's formula (applied above)
- 759.658 miles
- 1222.551 kilometers
- 660.125 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
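The iterative inverse Vincenty calculation can be sketched in plain Python. This is an illustrative stdlib-only implementation on the WGS-84 ellipsoid, not the site's own code; the coordinates are the ones listed in the airport tables below.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = a * (1 - f)            # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m
              * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# YXN (62°14′24″N, 92°35′53″W) -> YFB (63°45′23″N, 68°33′20″W)
print(round(vincenty_km(62.24, -92.598056, 63.756389, -68.555556), 3))
```

The printed value agrees with the ~1222.55 km figure above to within rounding of the input coordinates.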
Haversine formula
- 756.810 miles
- 1217.968 kilometers
- 657.650 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
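The haversine calculation is short enough to show in full. A minimal Python sketch, using a mean Earth radius of 6371 km (small differences from the figures above come down to which radius is chosen):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, R=6371.0):
    """Great-circle distance between two lat/lon points on a sphere of radius R."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# YXN -> YFB, then converted to statute and nautical miles
km = haversine_km(62.24, -92.598056, 63.756389, -68.555556)
print(round(km, 1), round(km * 0.621371, 1), round(km * 0.539957, 1))
```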
How long does it take to fly from Whale Cove to Iqaluit?
The estimated flight time from Whale Cove Airport to Iqaluit Airport is 1 hour and 56 minutes.
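A common rule of thumb for such estimates is distance divided by an average cruise speed, plus a fixed allowance for taxi, climb, and descent. The 500 mph cruise and 30-minute overhead below are illustrative assumptions, not the site's exact model, but they land within a few minutes of the figure above:

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus fixed taxi/climb/descent overhead.
    cruise_mph and overhead_min are assumed values for illustration."""
    return overhead_min + distance_miles / cruise_mph * 60

mins = flight_time_minutes(760)
print(f"{int(mins // 60)} h {int(mins % 60)} min")
```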
What is the time difference between Whale Cove and Iqaluit?
The time difference between Whale Cove and Iqaluit is 1 hour. Iqaluit is 1 hour ahead of Whale Cove.
Flight carbon footprint between Whale Cove Airport (YXN) and Iqaluit Airport (YFB)
On average, flying from Whale Cove to Iqaluit generates about 131 kg (289 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
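The kilogram-to-pound conversion checks out, using the exact definition of the international pound:

```python
KG_PER_LB = 0.45359237  # exact definition of the international avoirdupois pound

co2_kg = 131
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # → 289
```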
Map of flight path and driving directions from Whale Cove to Iqaluit
See the map of the shortest flight path between Whale Cove Airport (YXN) and Iqaluit Airport (YFB).
Airport information
| Origin | Whale Cove Airport |
| --- | --- |
| City: | Whale Cove |
| Country: | Canada |
| IATA Code: | YXN |
| ICAO Code: | CYXN |
| Coordinates: | 62°14′24″N, 92°35′53″W |
| Destination | Iqaluit Airport |
| --- | --- |
| City: | Iqaluit |
| Country: | Canada |
| IATA Code: | YFB |
| ICAO Code: | CYFB |
| Coordinates: | 63°45′23″N, 68°33′20″W |
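The coordinates above are given in degrees/minutes/seconds; distance formulas need signed decimal degrees. A small conversion helper (the function name is our own, for illustration):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees; S and W are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Whale Cove Airport: 62°14′24″N, 92°35′53″W
print(dms_to_decimal(62, 14, 24, "N"), dms_to_decimal(92, 35, 53, "W"))
```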