How far is Bella Coola from Arviat?
The distance between Arviat (Arviat Airport) and Bella Coola (Bella Coola Airport) is 1357 miles / 2183 kilometers / 1179 nautical miles.
The driving distance from Arviat (YEK) to Bella Coola (QBC) is 1891 miles / 3043 kilometers, and travel time by car is about 44 hours 29 minutes.
Arviat Airport – Bella Coola Airport
Distance from Arviat to Bella Coola
There are several ways to calculate the distance from Arviat to Bella Coola. Here are two standard methods:
Vincenty's formula (applied above)
- 1356.620 miles
- 2183.268 kilometers
- 1178.871 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
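If you want to reproduce the figure yourself, below is a minimal, self-contained Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. This page does not publish its own code, so details such as the convergence tolerance are assumptions; the airport coordinates come from the tables further down.

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse geodesic distance on the WGS-84 ellipsoid (Vincenty, 1975)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)

    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)   # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # metres -> kilometres

# YEK -> QBC, decimal degrees from the airport tables below
print(vincenty_distance_km(61.09417, -94.07056, 52.3875, -126.59583))
# should come out near the 2183 km figure quoted above
```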
Haversine formula
- 1352.484 miles
- 2176.613 kilometers
- 1175.277 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
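The haversine calculation is short enough to show in full. The sketch below assumes the conventional mean Earth radius of 6371 km; the radius used for the figure above is not stated, so small differences in the last digits are expected.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YEK -> QBC, decimal degrees from the airport tables below
print(haversine_km(61.09417, -94.07056, 52.3875, -126.59583))
# about 2177 km, close to the haversine figure above
```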
How long does it take to fly from Arviat to Bella Coola?
The estimated flight time from Arviat Airport to Bella Coola Airport is 3 hours and 4 minutes.
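The page does not explain how this estimate is derived. A common rule of thumb for such figures is the great-circle distance divided by an assumed average cruise speed, plus a fixed allowance for taxi, climb and descent; both parameters in the sketch below are assumptions, not the page's actual model, so the result only lands in the same ballpark as the figure above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent
    allowance. Both parameters are assumed values, not the page's model."""
    hours = distance_miles / cruise_mph + overhead_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours {m} minutes"

print(estimated_flight_time(1357))  # about 3 hours 13 minutes with these assumptions
```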
What is the time difference between Arviat and Bella Coola?
The time difference between Arviat and Bella Coola is 2 hours. Bella Coola is 2 hours behind Arviat.
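The offset can be checked with Python's zoneinfo module. The zone names below are assumptions: Arviat sits in Nunavut's Central Time area (America/Rankin_Inlet in the tz database), while Bella Coola is on Pacific Time (America/Vancouver).

```python
from datetime import datetime
from zoneinfo import ZoneInfo

arviat = ZoneInfo("America/Rankin_Inlet")      # assumed zone for Arviat, NU
bella_coola = ZoneInfo("America/Vancouver")    # assumed zone for Bella Coola, BC

now = datetime.now(tz=arviat)
# Compare UTC offsets at the same instant
diff = now.utcoffset() - now.astimezone(bella_coola).utcoffset()
print(f"Bella Coola is {diff.total_seconds() / 3600:.0f} hours behind Arviat")  # 2
```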
Flight carbon footprint between Arviat Airport (YEK) and Bella Coola Airport (QBC)
On average, flying from Arviat to Bella Coola generates about 170 kg of CO2 per passenger (roughly 375 pounds). The figures are estimates and include only the CO2 generated by burning jet fuel.
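The emission model behind the figure is not documented here. The sketch below simply scales a per-kilometre factor back-calculated from the page's own numbers (about 170 kg over roughly 2183 km, or about 0.078 kg CO2 per passenger-kilometre); that factor is an assumption for illustration, not a published emission factor.

```python
def co2_estimate_kg(distance_km, kg_co2_per_pax_km=0.078):
    """Very rough per-passenger CO2 estimate; the factor is back-calculated
    from this page's own figures, not an official emission factor."""
    return distance_km * kg_co2_per_pax_km

kg = co2_estimate_kg(2183)
print(f"{kg:.0f} kg CO2 per passenger (~{kg * 2.20462:.0f} lb)")  # ~170 kg, ~375 lb
```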
Map of flight path and driving directions from Arviat to Bella Coola
See the map of the shortest flight path between Arviat Airport (YEK) and Bella Coola Airport (QBC).
Airport information
| Origin | Arviat Airport |
| --- | --- |
| City: | Arviat |
| Country: | Canada |
| IATA Code: | YEK |
| ICAO Code: | CYEK |
| Coordinates: | 61°5′39″N, 94°4′14″W |
| Destination | Bella Coola Airport |
| --- | --- |
| City: | Bella Coola |
| Country: | Canada |
| IATA Code: | QBC |
| ICAO Code: | CYBD |
| Coordinates: | 52°23′15″N, 126°35′45″W |