How far is Bearskin Lake from Natashquan?
The distance between Natashquan (Natashquan Airport) and Bearskin Lake (Bearskin Lake Airport) is 1263 miles / 2033 kilometers / 1098 nautical miles.
The driving distance from Natashquan (YNA) to Bearskin Lake (XBE) is 2131 miles / 3429 kilometers, and travel time by car is about 56 hours 39 minutes.
Natashquan Airport – Bearskin Lake Airport
Distance from Natashquan to Bearskin Lake
There are several ways to calculate the distance from Natashquan to Bearskin Lake. Here are two standard methods:
Vincenty's formula (applied above)
- 1263.006 miles
- 2032.612 kilometers
- 1097.523 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
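As a minimal sketch, the Vincenty inverse method can be implemented in pure Python using the standard published iteration and WGS-84 constants; the YNA/XBE coordinates are converted to decimal degrees from the airport tables at the end of this page, and the result should land close to the 1263.006-mile figure above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid, returning miles."""
    a = 6378137.0            # semi-major axis (meters)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (meters)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0.0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1.0 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2.0 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # guard for equatorial lines
        C = f / 16.0 * cos2Alpha * (4.0 + f * (4.0 - 3.0 * cos2Alpha))
        lamPrev = lam
        lam = L + (1.0 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1.0 + 2.0 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1.0 + u2 / 16384.0 * (4096.0 + u2 * (-768.0 + u2 * (320.0 - 175.0 * u2)))
    B = u2 / 1024.0 * (256.0 + u2 * (-128.0 + u2 * (74.0 - 47.0 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4.0 * (
        cosSigma * (-1.0 + 2.0 * cos2SigmaM ** 2)
        - B / 6.0 * cos2SigmaM * (-3.0 + 4.0 * sinSigma ** 2)
        * (-3.0 + 4.0 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1609.344  # meters -> statute miles

# YNA (50°11′23″N, 61°47′21″W) and XBE (53°57′56″N, 91°1′37″W)
print(round(vincenty_miles(50.1897, -61.7892, 53.9656, -91.0269), 1))
```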
Haversine formula
- 1259.104 miles
- 2026.331 kilometers
- 1094.131 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
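The haversine formula is short enough to sketch directly. The mean Earth radius below (3958.8 miles, about 6371 km) is an assumption; the exact result shifts slightly with the radius chosen.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_miles * math.asin(math.sqrt(h))

print(round(haversine_miles(50.1897, -61.7892, 53.9656, -91.0269), 1))
```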
How long does it take to fly from Natashquan to Bearskin Lake?
The estimated flight time from Natashquan Airport to Bearskin Lake Airport is 2 hours and 53 minutes.
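The assumptions behind the 2 hour 53 minute estimate aren't stated here. A common rule of thumb is cruise distance over an average airliner speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses an illustrative 500 mph and 30 minutes, which gives a figure in the same ballpark.

```python
def estimate_flight_time(distance_miles, avg_speed_mph=500, overhead_hours=0.5):
    """Cruise time plus a fixed allowance for taxi, climb, and descent.
    The speed and overhead here are illustrative assumptions, not the
    exact model behind the estimate above."""
    hours = distance_miles / avg_speed_mph + overhead_hours
    h, m = int(hours), round((hours % 1) * 60)
    return f"{h} hours {m} minutes"

print(estimate_flight_time(1263.0))  # "3 hours 2 minutes" under these assumptions
```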
What is the time difference between Natashquan and Bearskin Lake?
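The offset depends on which IANA time-zone entries apply, and both communities sit close to zone boundaries. A minimal sketch, assuming the zone names commented below, computes the current difference:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Zone names are assumptions: verify the correct IANA entries for each
# airport before relying on this, since both towns are near boundaries.
yna_zone = ZoneInfo("America/Toronto")   # assumed zone for Natashquan, QC
xbe_zone = ZoneInfo("America/Winnipeg")  # assumed zone for Bearskin Lake, ON

now = datetime.now(tz=yna_zone)
diff = now.utcoffset() - now.astimezone(xbe_zone).utcoffset()
print(f"Natashquan is currently {diff} ahead of Bearskin Lake")
```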
Flight carbon footprint between Natashquan Airport (YNA) and Bearskin Lake Airport (XBE)
On average, flying from Natashquan to Bearskin Lake generates about 164 kg (roughly 362 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
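The per-mile emission factor in the sketch below is back-solved from the figures above (164 kg over 1263 miles, about 0.13 kg per passenger-mile) purely for illustration; the underlying emission model is not published here.

```python
KG_PER_LB = 0.45359237

def co2_kg(distance_miles, kg_per_passenger_mile=0.13):
    """Rough per-passenger CO2 estimate. The 0.13 kg/mile factor is
    back-solved from the figures above for illustration only."""
    return distance_miles * kg_per_passenger_mile

kg = co2_kg(1263.0)
print(f"{kg:.0f} kg CO2 per passenger = {kg / KG_PER_LB:.0f} lb")
```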
Map of flight path and driving directions from Natashquan to Bearskin Lake
See the map of the shortest flight path between Natashquan Airport (YNA) and Bearskin Lake Airport (XBE).
Airport information
| Origin | Natashquan Airport |
| --- | --- |
| City: | Natashquan |
| Country: | Canada |
| IATA Code: | YNA |
| ICAO Code: | CYNA |
| Coordinates: | 50°11′23″N, 61°47′21″W |
| Destination | Bearskin Lake Airport |
| --- | --- |
| City: | Bearskin Lake |
| Country: | Canada |
| IATA Code: | XBE |
| ICAO Code: | CNE3 |
| Coordinates: | 53°57′56″N, 91°1′37″W |