How far is Fort St. John from Bella Coola?
The distance between Bella Coola (Bella Coola Airport) and Fort St. John (Fort St. John Airport) is 356 miles / 573 kilometers / 309 nautical miles.
The driving distance from Bella Coola (QBC) to Fort St. John (YXJ) is 690 miles / 1111 kilometers, and travel time by car is about 15 hours 42 minutes.
Bella Coola Airport – Fort St. John Airport
Distance from Bella Coola to Fort St. John
There are several ways to calculate the distance from Bella Coola to Fort St. John. Here are two standard methods:
Vincenty's formula (applied above)
- 356.144 miles
- 573.158 kilometers
- 309.481 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
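The classic Vincenty inverse iteration can be sketched in a few dozen lines. This version assumes the WGS-84 ellipsoid (the usual choice for such calculators, though the page does not say which ellipsoid it uses) and skips the edge cases (coincident and near-antipodal points) that a production implementation must handle:

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid (distance in km)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):          # iterate lambda until convergence
        sinlam, coslam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sinlam) ** 2
                         + (cosU1 * sinU2 - sinU1 * cosU2 * coslam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
            + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# Decimal-degree equivalents of the DMS coordinates in the tables below
d = vincenty_km(52.387500, -126.595833, 56.238056, -120.739722)
print(round(d, 2))  # ≈ 573.16 km, matching the figure quoted above
```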
Haversine formula
- 355.416 miles
- 571.986 kilometers
- 308.848 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
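As a concrete check, the haversine figure above can be reproduced in a few lines, using a mean Earth radius of 6371 km (an assumption; the page does not state which radius it uses) and the decimal-degree equivalents of the coordinates listed below:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Airport coordinates converted to decimal degrees (from the tables below)
qbc = (52.387500, -126.595833)   # Bella Coola (QBC)
yxj = (56.238056, -120.739722)   # Fort St. John (YXJ)

d = haversine_km(*qbc, *yxj)
print(round(d, 1))  # → 572.0 km, matching the figure above
```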
How long does it take to fly from Bella Coola to Fort St. John?
The estimated flight time from Bella Coola Airport to Fort St. John Airport is 1 hour and 10 minutes.
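Estimates like this are typically the great-circle distance divided by an assumed average cruise speed, plus a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute overhead below are illustrative assumptions, not the site's actual parameters:

```python
# Rough flight-time model: distance / cruise speed + fixed overhead.
# 500 mph cruise and 30 min overhead are assumptions for illustration.
distance_mi = 356.144          # Vincenty distance from above
cruise_mph = 500.0
overhead_min = 30.0

total_min = distance_mi / cruise_mph * 60 + overhead_min
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # → 1 h 13 min
```

This lands close to the quoted 1 hour 10 minutes; the exact figure depends on the speed and overhead the calculator assumes.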
What is the time difference between Bella Coola and Fort St. John?
Bella Coola is on Pacific Time, while Fort St. John observes Mountain Standard Time year-round (it does not use daylight saving). Fort St. John is therefore 1 hour ahead of Bella Coola in winter, and the two share the same clock time while Pacific Daylight Time is in effect.
Flight carbon footprint between Bella Coola Airport (QBC) and Fort St. John Airport (YXJ)
On average, flying from Bella Coola to Fort St. John generates about 77 kg of CO2 per passenger, roughly 170 pounds (lb). The figures are estimates and include only the CO2 generated by burning jet fuel.
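Per-passenger CO2 estimates of this kind usually scale an emission factor by the flight distance. The factor below (~0.134 kg CO2 per passenger-km) is simply back-solved from the page's own numbers (77 kg over 573 km) for illustration; real calculators derive it from aircraft type, load factor, and fuel burn:

```python
# Hypothetical per-passenger emission factor, back-solved from the
# figures above (77 kg CO2 / 573 km) -- an illustration, not the
# calculator's real parameter.
KG_CO2_PER_PAX_KM = 0.134

distance_km = 573.158            # Vincenty distance from above
co2_kg = distance_km * KG_CO2_PER_PAX_KM
print(round(co2_kg))             # → 77
```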
Map of flight path and driving directions from Bella Coola to Fort St. John
See the map of the shortest flight path between Bella Coola Airport (QBC) and Fort St. John Airport (YXJ).
Airport information
Origin | Bella Coola Airport |
---|---|
City: | Bella Coola |
Country: | Canada |
IATA Code: | QBC |
ICAO Code: | CYBD |
Coordinates: | 52°23′15″N, 126°35′45″W |
Destination | Fort St. John Airport |
---|---|
City: | Fort St. John |
Country: | Canada |
IATA Code: | YXJ |
ICAO Code: | CYXJ |
Coordinates: | 56°14′17″N, 120°44′23″W |
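The tables give coordinates in degrees-minutes-seconds, while distance formulas need decimal degrees. A small converter, applied to the table's values (the function name is my own):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (deg + minutes / 60 + seconds / 3600)

# Values from the airport tables above
print(round(dms_to_decimal(52, 23, 15, "N"), 6))   # QBC latitude  → 52.3875
print(round(dms_to_decimal(126, 35, 45, "W"), 6))  # QBC longitude → -126.595833
print(round(dms_to_decimal(56, 14, 17, "N"), 6))   # YXJ latitude  → 56.238056
print(round(dms_to_decimal(120, 44, 23, "W"), 6))  # YXJ longitude → -120.739722
```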