How far is Bella Coola from Fort Chipewyan?
The distance between Fort Chipewyan (Fort Chipewyan Airport) and Bella Coola (Bella Coola Airport) is 747 miles / 1203 kilometers / 649 nautical miles.
The driving distance from Fort Chipewyan (YPY) to Bella Coola (QBC) is 1278 miles / 2056 kilometers, and travel time by car is about 30 hours 5 minutes.
Fort Chipewyan Airport – Bella Coola Airport
Distance from Fort Chipewyan to Bella Coola
There are several ways to calculate the distance from Fort Chipewyan to Bella Coola. Here are two standard methods:
Vincenty's formula (applied above)
- 747.304 miles
- 1202.669 kilometers
- 649.389 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
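For reference, Vincenty's inverse method can be sketched in Python. This is a minimal implementation of the standard iterative algorithm on the WGS-84 ellipsoid, with the airport coordinates converted to decimal degrees from the DMS values in the airport information section below; it is an illustrative sketch, not necessarily the exact routine used for the figure above.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid
    (Vincenty's iterative inverse method)."""
    a = 6378137.0               # semi-major axis, m
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis, m

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)    # points on the equator
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                          * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# YPY (58°46′1″N, 111°7′1″W) to QBC (52°23′15″N, 126°35′45″W)
d_km = vincenty_distance_m(58.7669, -111.1169, 52.3875, -126.5958) / 1000
```

With these coordinates the result lands near the 1,202.669 km quoted above; tiny differences come from rounding the coordinates.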
Haversine formula
- 745.330 miles
- 1199.493 kilometers
- 647.674 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
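The haversine calculation is compact enough to show in full. A minimal Python sketch, assuming a mean earth radius of 6,371.0088 km (sources vary slightly on the radius, which shifts the result by a few hundred metres):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres on a spherical earth."""
    R = 6371.0088  # assumed mean earth radius, km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

# YPY to QBC, coordinates in decimal degrees
d_km = haversine_km(58.7669, -111.1169, 52.3875, -126.5958)  # ≈ 1199 km
```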
How long does it take to fly from Fort Chipewyan to Bella Coola?
The estimated flight time from Fort Chipewyan Airport to Bella Coola Airport is 1 hour and 54 minutes.
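The site does not state how it derives this estimate. A common rule of thumb divides the air distance by a typical cruise speed and adds a fixed allowance for taxi, takeoff, climb, and descent; the 500 mph speed and 30-minute buffer below are assumptions, which is why the result differs slightly from the 1 hour 54 minutes quoted above:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, buffer_min=30):
    """Rough airport-to-airport time: cruise leg plus a fixed buffer
    (both values are assumed, not the site's published method)."""
    return buffer_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(747.304)  # ≈ 120 minutes
```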
What is the time difference between Fort Chipewyan and Bella Coola?
Fort Chipewyan, Alberta is on Mountain Time and Bella Coola, British Columbia is on Pacific Time, so Bella Coola is 1 hour behind Fort Chipewyan.
Flight carbon footprint between Fort Chipewyan Airport (YPY) and Bella Coola Airport (QBC)
On average, flying from Fort Chipewyan to Bella Coola generates about 130 kg of CO2 per passenger, equivalent to roughly 286 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
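The unit conversion and the per-mile rate implied by these numbers are easy to check. The kg-per-lb factor is the exact definition of the avoirdupois pound; the per-mile figure is simply derived from the estimates above, not the site's actual emissions methodology:

```python
KG_PER_LB = 0.45359237          # exact definition of the pound in kilograms

co2_kg = 130                    # site's per-passenger estimate
co2_lb = co2_kg / KG_PER_LB     # ≈ 286.6 lb, rounded to 286 above
per_mile_kg = co2_kg / 747.304  # implied ≈ 0.17 kg CO2 per passenger-mile
```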
Map of flight path and driving directions from Fort Chipewyan to Bella Coola
See the map of the shortest flight path between Fort Chipewyan Airport (YPY) and Bella Coola Airport (QBC).
Airport information
| Origin | Fort Chipewyan Airport |
| --- | --- |
| City | Fort Chipewyan |
| Country | Canada |
| IATA Code | YPY |
| ICAO Code | CYPY |
| Coordinates | 58°46′1″N, 111°7′1″W |
| Destination | Bella Coola Airport |
| --- | --- |
| City | Bella Coola |
| Country | Canada |
| IATA Code | QBC |
| ICAO Code | CYBD |
| Coordinates | 52°23′15″N, 126°35′45″W |