How far is Brandon from Bella Coola?
The distance between Bella Coola (Bella Coola Airport) and Brandon (Brandon Municipal Airport) is 1164 miles / 1874 kilometers / 1012 nautical miles.
The driving distance from Bella Coola (QBC) to Brandon (YBR) is 1527 miles / 2458 kilometers, and travel time by car is about 32 hours 2 minutes.
Bella Coola Airport – Brandon Municipal Airport
Distance from Bella Coola to Brandon
There are several ways to calculate the distance from Bella Coola to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 1164.247 miles
- 1873.674 kilometers
- 1011.703 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 1160.650 miles
- 1867.885 kilometers
- 1008.577 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
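For readers who want to reproduce the great-circle figure, here is a minimal Python sketch of the haversine formula. It uses the airport coordinates from the tables further down, converted to decimal degrees, and an assumed mean Earth radius of 6,371 km; since the site's exact constants are not stated, the result lands close to, but not exactly on, the numbers above.

```python
from math import radians, sin, cos, atan2, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius (6,371 km assumed)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return radius_km * 2 * atan2(sqrt(a), sqrt(1 - a))

# QBC and YBR coordinates from the airport information tables, in decimal degrees.
qbc = (52.3875, -126.595833)   # 52°23′15″N, 126°35′45″W
ybr = (49.91, -99.951667)      # 49°54′36″N, 99°57′6″W

print(round(haversine_km(*qbc, *ybr), 1))  # roughly 1868 km
```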
How long does it take to fly from Bella Coola to Brandon?
The estimated flight time from Bella Coola Airport to Brandon Municipal Airport is 2 hours and 42 minutes.
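The page does not say how this estimate is derived. A common back-of-the-envelope approach is cruise distance divided by an assumed average speed, plus a fixed allowance for climb and descent; the speed and allowance in the sketch below are illustrative assumptions, so the result only approximates the 2 hours 42 minutes quoted above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight time: cruise time plus a fixed climb/descent allowance.
    Both cruise_mph and overhead_min are assumed values, not the site's inputs."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1164.247))  # about 2 h 50 min with these assumptions
```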
What is the time difference between Bella Coola and Brandon?
Bella Coola is on Pacific Time and Brandon is on Central Time, so the time difference between Bella Coola and Brandon is 2 hours. Brandon is 2 hours ahead of Bella Coola.
Flight carbon footprint between Bella Coola Airport (QBC) and Brandon Municipal Airport (YBR)
On average, flying from Bella Coola to Brandon generates about 160 kg of CO2 per passenger, and 160 kilograms equals 353 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
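A figure in this range can be reproduced from the great-circle distance and a per-passenger emission factor. The factor used below (about 0.085 kg of CO2 per passenger-kilometre) is an assumption chosen for illustration, not the site's published methodology; the pound conversion uses the standard definition of 1 lb = 0.45359237 kg.

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def co2_estimate_kg(distance_km, kg_per_pax_km=0.085):
    """Per-passenger CO2 from burning jet fuel, using an assumed emission factor."""
    return distance_km * kg_per_pax_km

kg = co2_estimate_kg(1873.674)
print(round(kg))              # about 159 kg with this assumed factor
print(round(kg / KG_PER_LB))  # about 351 lb (160 kg is 353 lb)
```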
Map of flight path and driving directions from Bella Coola to Brandon
See the map of the shortest flight path between Bella Coola Airport (QBC) and Brandon Municipal Airport (YBR).
Airport information
| Origin | Bella Coola Airport |
| --- | --- |
| City: | Bella Coola |
| Country: | Canada |
| IATA Code: | QBC |
| ICAO Code: | CYBD |
| Coordinates: | 52°23′15″N, 126°35′45″W |
| Destination | Brandon Municipal Airport |
| --- | --- |
| City: | Brandon |
| Country: | Canada |
| IATA Code: | YBR |
| ICAO Code: | CYBR |
| Coordinates: | 49°54′36″N, 99°57′6″W |
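The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page work in decimal degrees. A small conversion sketch follows; the parsing pattern assumes the exact format used in these tables.

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 52°23′15″N to decimal degrees (negative for S/W)."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("52°23′15″N"), dms_to_decimal("126°35′45″W"))  # 52.3875 -126.5958...
```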