How far is Bearskin Lake from Port Hardy?
The distance between Port Hardy (Port Hardy Airport) and Bearskin Lake (Bearskin Lake Airport) is 1539 miles / 2476 kilometers / 1337 nautical miles.
The driving distance from Port Hardy (YZT) to Bearskin Lake (XBE) is 2298 miles / 3699 kilometers, and travel time by car is about 55 hours 30 minutes.
Port Hardy Airport – Bearskin Lake Airport
Distance from Port Hardy to Bearskin Lake
There are several ways to calculate the distance from Port Hardy to Bearskin Lake. Here are two standard methods:
Vincenty's formula (applied above)
- 1538.817 miles
- 2476.486 kilometers
- 1337.196 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
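A minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates from the tables below (the decimal values are converted from the DMS coordinates listed there); it should reproduce the 1538.817-mile figure above to within rounding:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    # Iterate on the difference in longitude on the auxiliary sphere;
    # may fail to converge for nearly antipodal points.
    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero when the geodesic runs along the equator
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)   # distance in metres

# YZT (50°40′50″N, 127°22′1″W) to XBE (53°57′56″N, 91°1′37″W)
d = vincenty_distance(50.680556, -127.366944, 53.965556, -91.026944)
print(f"{d / 1609.344:.1f} miles")   # ≈ 1538.8 miles
```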
Haversine formula
- 1533.958 miles
- 2468.666 kilometers
- 1332.973 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
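A sketch of the haversine formula in Python, assuming the commonly used mean Earth radius of 6371 km (the page does not state which radius it uses):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere of the given mean Earth radius.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YZT to XBE, same decimal coordinates as in the Vincenty example
d_km = haversine_distance(50.680556, -127.366944, 53.965556, -91.026944)
print(f"{d_km:.1f} km / {d_km / 1.609344:.1f} miles")  # ≈ 2468.7 km / 1534.0 miles
```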
How long does it take to fly from Port Hardy to Bearskin Lake?
The estimated flight time from Port Hardy Airport to Bearskin Lake Airport is 3 hours and 24 minutes.
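Estimators like this typically add a fixed allowance for taxi, climb, and descent to cruise time at an assumed average speed. The parameters below (30 minutes plus 500 mph) are illustrative assumptions only, so the result will not exactly match the 3 hours 24 minutes quoted above:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # Illustrative model: fixed taxi/climb/descent allowance plus cruise time.
    # cruise_mph and overhead_min are assumed values, not this page's exact inputs.
    total_min = overhead_min + distance_miles / cruise_mph * 60
    return divmod(round(total_min), 60)

hours, minutes = estimated_flight_time(1539)
print(f"{hours} h {minutes} min")  # ≈ 3 h 35 min with these assumed parameters
```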
What is the time difference between Port Hardy and Bearskin Lake?
Bearskin Lake is 2 hours ahead of Port Hardy: Port Hardy is in the Pacific Time zone, while Bearskin Lake, in northwestern Ontario, is in the Central Time zone.
Flight carbon footprint between Port Hardy Airport (YZT) and Bearskin Lake Airport (XBE)
On average, flying from Port Hardy to Bearskin Lake generates about 182 kg of CO2 per passenger; 182 kilograms equals 401 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
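The per-passenger figure implies an emissions factor of roughly 182 kg ÷ 1539 mi ≈ 0.118 kg of CO2 per passenger-mile. A sketch of the arithmetic, treating that factor as back-calculated from this page's own numbers rather than a published constant:

```python
KG_PER_LB = 0.45359237   # exact definition of the pound

def co2_estimate(distance_miles, kg_per_passenger_mile=182 / 1539):
    # kg_per_passenger_mile is inferred from this page's 182 kg over 1539 miles,
    # not an official emissions factor.
    kg = distance_miles * kg_per_passenger_mile
    return kg, kg / KG_PER_LB

kg, lbs = co2_estimate(1539)
print(f"{kg:.0f} kg CO2 ≈ {lbs:.0f} lbs")  # 182 kg ≈ 401 lbs
```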
Map of flight path and driving directions from Port Hardy to Bearskin Lake
See the map of the shortest flight path between Port Hardy Airport (YZT) and Bearskin Lake Airport (XBE).
Airport information
| Origin | Port Hardy Airport |
| --- | --- |
| City | Port Hardy |
| Country | Canada |
| IATA Code | YZT |
| ICAO Code | CYZT |
| Coordinates | 50°40′50″N, 127°22′1″W |
| Destination | Bearskin Lake Airport |
| --- | --- |
| City | Bearskin Lake |
| Country | Canada |
| IATA Code | XBE |
| ICAO Code | CNE3 |
| Coordinates | 53°57′56″N, 91°1′37″W |