How far is Bella Coola from Summer Beaver?
The distance between Summer Beaver (Summer Beaver Airport) and Bella Coola (Bella Coola Airport) is 1585 miles / 2551 kilometers / 1378 nautical miles.
The driving distance from Summer Beaver (SUR) to Bella Coola (QBC) is 2147 miles / 3455 kilometers, and travel time by car is about 48 hours 33 minutes.
Summer Beaver Airport – Bella Coola Airport
Distance from Summer Beaver to Bella Coola
There are several ways to calculate the distance from Summer Beaver to Bella Coola. Here are two standard methods:
Vincenty's formula (applied above)
- 1585.359 miles
- 2551.388 kilometers
- 1377.639 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
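Vincenty's method is iterative but compact enough to sketch in Python. Below is a minimal implementation of the inverse formula on the WGS-84 ellipsoid, with the airport coordinates from the tables below converted to decimal degrees; it should reproduce the figures above to within rounding.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres (Vincenty inverse formula, WGS-84)."""
    a = 6378137.0                     # semi-major axis (m)
    f = 1 / 298.257223563             # flattening
    b = (1 - f) * a                   # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):              # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# SUR -> QBC (decimal degrees)
m = vincenty_distance(52.70833, -88.54167, 52.3875, -126.59583)
print(f"{m / 1609.344:.3f} mi / {m / 1000:.3f} km / {m / 1852:.3f} nm")
# -> roughly 1585.4 mi / 2551.4 km / 1377.6 nm
```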
Haversine formula
- 1580.245 miles
- 2543.157 kilometers
- 1373.195 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
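The haversine formula is short enough to implement directly. A minimal sketch, assuming the conventional mean Earth radius of 6,371 km:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# SUR -> QBC (decimal degrees)
km = haversine_distance(52.70833, -88.54167, 52.3875, -126.59583)
print(f"{km / 1.609344:.3f} mi / {km:.3f} km / {km / 1.852:.3f} nm")
# -> roughly 1580.2 mi / 2543.2 km / 1373.2 nm
```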
How long does it take to fly from Summer Beaver to Bella Coola?
The estimated flight time from Summer Beaver Airport to Bella Coola Airport is 3 hours and 30 minutes.
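Flight-time figures like this usually come from a simple rule of thumb rather than airline schedules. A sketch, assuming an average cruise speed of about 500 mph plus a fixed allowance for taxi, takeoff, and landing (the calculator's exact constants are not stated, so this lands near, not exactly on, the quoted time):

```python
distance_mi = 1585      # great-circle distance from above
cruise_mph = 500        # assumed average cruise speed
overhead_h = 0.5        # assumed allowance for taxi, climb, and descent

hours = distance_mi / cruise_mph + overhead_h
print(f"~{int(hours)} h {round(hours % 1 * 60)} min")  # -> ~3 h 40 min
```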
What is the time difference between Summer Beaver and Bella Coola?
The time difference between Summer Beaver (Eastern Time) and Bella Coola (Pacific Time) is 3 hours: Bella Coola is 3 hours behind Summer Beaver.
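A quick way to check this in Python, assuming the IANA zones America/Toronto for Summer Beaver (northern Ontario) and America/Vancouver for Bella Coola (British Columbia):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Assumed IANA time zones for the two communities
now = datetime.now(ZoneInfo("UTC"))
origin = now.astimezone(ZoneInfo("America/Toronto"))    # Summer Beaver
dest = now.astimezone(ZoneInfo("America/Vancouver"))    # Bella Coola

diff_h = (origin.utcoffset() - dest.utcoffset()).total_seconds() / 3600
print(f"Bella Coola is {diff_h:g} hours behind Summer Beaver")  # -> 3 hours
```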
Flight carbon footprint between Summer Beaver Airport (SUR) and Bella Coola Airport (QBC)
On average, flying from Summer Beaver to Bella Coola generates about 185 kg (408 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
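The kilograms-to-pounds conversion is easy to verify, since the pound is defined as exactly 0.45359237 kg:

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound
co2_kg = 185
print(f"{co2_kg / KG_PER_LB:.0f} lbs")  # -> 408 lbs
```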
Map of flight path and driving directions from Summer Beaver to Bella Coola
See the map of the shortest flight path between Summer Beaver Airport (SUR) and Bella Coola Airport (QBC).
Airport information
| Origin | Summer Beaver Airport |
| --- | --- |
| City: | Summer Beaver |
| Country: | Canada |
| IATA Code: | SUR |
| ICAO Code: | CJV7 |
| Coordinates: | 52°42′30″N, 88°32′30″W |
| Destination | Bella Coola Airport |
| --- | --- |
| City: | Bella Coola |
| Country: | Canada |
| IATA Code: | QBC |
| ICAO Code: | CYBD |
| Coordinates: | 52°23′15″N, 126°35′45″W |
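The coordinates in these tables are given in degrees, minutes, and seconds, while the distance formulas above expect decimal degrees. A small, hypothetical helper for the conversion:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate such as 52°42′30″N to signed decimal degrees."""
    deg, minute, sec, hemi = re.fullmatch(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("52°42′30″N"), dms_to_decimal("88°32′30″W"))
# -> 52.7083... -88.5416...
```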