How far is Bella Coola from Wabush?
The distance between Wabush (Wabush Airport) and Bella Coola (Bella Coola Airport) is 2438 miles / 3923 kilometers / 2118 nautical miles.
The driving distance from Wabush (YWK) to Bella Coola (QBC) is 3803 miles / 6120 kilometers, and travel time by car is about 82 hours 49 minutes.
Wabush Airport – Bella Coola Airport
Distance from Wabush to Bella Coola
There are several ways to calculate the distance from Wabush to Bella Coola. Here are two standard methods:
Vincenty's formula (applied above)
- 2437.756 miles
- 3923.187 kilometers
- 2118.352 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
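The iteration behind Vincenty's inverse method can be sketched in Python. This is a minimal implementation on the WGS-84 ellipsoid, not the calculator's actual code; the airport coordinates are the DMS values from the table below converted to decimal degrees (west longitudes negative).

```python
import math

def vincenty_inverse_m(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0          # semi-major axis (m)
    f = 1 / 298.257223563  # flattening
    b = (1 - f) * a

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude on the auxiliary sphere converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
            B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
            (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# Wabush (YWK) and Bella Coola (QBC) in decimal degrees
d_km = vincenty_inverse_m(52.921667, -66.864167, 52.387500, -126.595833) / 1000
print(round(d_km, 1))  # about 3923 km
```

Because Vincenty accounts for the Earth's flattening, its result differs from the spherical haversine figure by roughly 13 km on this route.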
Haversine formula
- 2429.882 miles
- 3910.517 kilometers
- 2111.510 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
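A minimal Python sketch of the haversine computation, using the airport coordinates from the table below converted to decimal degrees and an assumed mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km, assuming a spherical Earth (R = 6371 km)."""
    R = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Wabush (YWK) and Bella Coola (QBC); west longitudes are negative
d_km = haversine_km(52.921667, -66.864167, 52.387500, -126.595833)
print(round(d_km, 1))  # about 3910 km
```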
How long does it take to fly from Wabush to Bella Coola?
The estimated flight time from Wabush Airport to Bella Coola Airport is 5 hours and 6 minutes.
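The model behind the quoted figure isn't stated. A common rule of thumb, roughly 30 minutes for taxi, climb, and descent plus cruise at about 500 mph, lands in the same ballpark; both numbers here are assumed, not the calculator's actual parameters.

```python
# Rough flight-time estimate: fixed overhead plus cruise time.
distance_miles = 2437.756  # great-circle distance from above
cruise_mph = 500           # assumed average cruise speed
overhead_h = 0.5           # assumed taxi/climb/descent allowance

hours = overhead_h + distance_miles / cruise_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # about 5 h 23 min
```

This simple model gives roughly 5 h 23 min versus the 5 h 6 min quoted above, so the site evidently assumes a somewhat faster cruise or smaller overhead.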
What is the time difference between Wabush and Bella Coola?
The time difference between Wabush and Bella Coola is 4 hours: Bella Coola is 4 hours behind Wabush.
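Wabush, in Labrador, observes Atlantic Time while Bella Coola observes Pacific Time, and since both regions use daylight saving the gap should hold at 4 hours year-round. A quick check with Python's zoneinfo (the zone names America/Goose_Bay and America/Vancouver are my assumed mappings for these towns):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

when = datetime(2024, 6, 1, 12, 0)
wabush = when.replace(tzinfo=ZoneInfo("America/Goose_Bay"))    # Atlantic Time
bella_coola = wabush.astimezone(ZoneInfo("America/Vancouver"))  # Pacific Time

diff = wabush.utcoffset() - bella_coola.utcoffset()
print(diff)  # 4:00:00
```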
Flight carbon footprint between Wabush Airport (YWK) and Bella Coola Airport (QBC)
On average, flying from Wabush to Bella Coola generates about 268 kg (591 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
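As a sanity check on the unit conversion, and to see the per-kilometre emission factor the figure implies (derived from the numbers above, not an official factor):

```python
co2_kg = 268                 # per-passenger estimate quoted above
co2_lb = co2_kg * 2.20462    # kilograms to pounds
per_km = co2_kg / 3923.187   # implied kg of CO2 per passenger-kilometre

print(round(co2_lb))         # 591
print(round(per_km, 3))      # about 0.068 kg CO2 per passenger-km
```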
Airport information
| Origin | Wabush Airport |
|---|---|
| City | Wabush |
| Country | Canada |
| IATA Code | YWK |
| ICAO Code | CYWK |
| Coordinates | 52°55′18″N, 66°51′51″W |
| Destination | Bella Coola Airport |
|---|---|
| City | Bella Coola |
| Country | Canada |
| IATA Code | QBC |
| ICAO Code | CYBD |
| Coordinates | 52°23′15″N, 126°35′45″W |
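The coordinates above are given in degrees, minutes, and seconds; converting them to the decimal degrees used by the distance formulas is simple arithmetic:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Wabush (YWK): 52°55′18″N, 66°51′51″W
lat = dms_to_decimal(52, 55, 18, "N")
lon = dms_to_decimal(66, 51, 51, "W")
print(round(lat, 6), round(lon, 6))  # approximately 52.921667 -66.864167
```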