How far is Brochet from Bearskin Lake?
The distance between Bearskin Lake (Bearskin Lake Airport) and Brochet (Brochet Airport) is 494 miles / 795 kilometers / 429 nautical miles.
The driving distance from Bearskin Lake (XBE) to Brochet (YBT) is 1309 miles / 2107 kilometers, and travel time by car is about 37 hours 42 minutes.
Bearskin Lake Airport – Brochet Airport
Distance from Bearskin Lake to Brochet
There are several ways to calculate the distance from Bearskin Lake to Brochet. Here are two standard methods:
Vincenty's formula (applied above)
- 493.987 miles
- 794.995 kilometers
- 429.263 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
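For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The airport coordinates are taken from the tables further down; the iteration cap and convergence tolerance are our own choices, not anything this page documents.

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns metres."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis, flattening
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                  # iterate lambda until it converges
        sinL, cosL = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinL, cosU1 * sinU2 - sinU1 * cosU2 * cosL)
        if sin_sigma == 0:
            return 0.0                    # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cosL
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinL / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# XBE -> YBT, decimal-degree coordinates from the airport tables below
m = vincenty_m(53.96556, -91.02694, 57.88917, -101.67889)
print(f"{m / 1000:.3f} km, {m / 1609.344:.3f} mi, {m / 1852:.3f} NM")  # ~795 km
```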
Haversine formula
- 492.616 miles
- 792.788 kilometers
- 428.072 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path along the earth's surface between two points).
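A corresponding sketch of the haversine calculation; the mean earth radius of 6,371 km is our assumption, since the page does not state which radius it uses:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# XBE -> YBT, decimal-degree coordinates from the airport tables below
d = haversine_km(53.96556, -91.02694, 57.88917, -101.67889)
print(f"{d:.1f} km, {d * 0.621371:.1f} mi, {d / 1.852:.1f} NM")  # ~792.8 km
```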
How long does it take to fly from Bearskin Lake to Brochet?
The estimated flight time from Bearskin Lake Airport to Brochet Airport is 1 hour and 26 minutes.
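The page does not say how that estimate is modelled. A common rule of thumb, a fixed allowance for climb and descent plus cruise at an average ground speed, lands close to the quoted figure; both parameters below are illustrative assumptions, not this site's methodology:

```python
def estimate_flight_time(distance_mi, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: fixed climb/descent overhead plus cruise."""
    minutes = overhead_min + distance_mi / cruise_mph * 60
    return divmod(round(minutes), 60)

h, m = estimate_flight_time(494)
print(f"{h} h {m} min")  # ~1 h 29 min, near the quoted 1 h 26 min
```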
What is the time difference between Bearskin Lake and Brochet?
There is no time difference between Bearskin Lake and Brochet.
Flight carbon footprint between Bearskin Lake Airport (XBE) and Brochet Airport (YBT)
On average, flying from Bearskin Lake to Brochet generates about 98 kg of CO2 per passenger; 98 kilograms is about 216 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
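As a quick sanity check on those numbers: the kg-to-lb factor is exact, while the per-mile figure is simply the quoted values divided out, not the site's published methodology:

```python
KG_PER_LB = 0.45359237          # exact definition of the pound
co2_kg, distance_mi = 98, 494   # figures quoted above

print(f"{co2_kg / KG_PER_LB:.0f} lb")           # ~216 lb
print(f"{co2_kg / distance_mi:.2f} kg CO2/mi")  # ~0.20 kg per passenger-mile
```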
Airport information
| Origin | Bearskin Lake Airport |
| --- | --- |
| City: | Bearskin Lake |
| Country: | Canada |
| IATA Code: | XBE |
| ICAO Code: | CNE3 |
| Coordinates: | 53°57′56″N, 91°1′37″W |
| Destination | Brochet Airport |
| --- | --- |
| City: | Brochet |
| Country: | Canada |
| IATA Code: | YBT |
| ICAO Code: | CYBT |
| Coordinates: | 57°53′21″N, 101°40′44″W |
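The distance formulas above take decimal degrees, while the tables list degrees/minutes/seconds. A small converter (the helper name is ours) recovers the decimal values used in the earlier examples:

```python
def dms_to_decimal(deg, minutes, seconds, hemi):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemi in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(53, 57, 56, "N"), dms_to_decimal(91, 1, 37, "W"))    # XBE: 53.96556 -91.02694
print(dms_to_decimal(57, 53, 21, "N"), dms_to_decimal(101, 40, 44, "W"))  # YBT: 57.88917 -101.67889
```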