How far is Bella Coola from Fort St. John?
The distance between Fort St. John (Fort St. John Airport) and Bella Coola (Bella Coola Airport) is 356 miles / 573 kilometers / 309 nautical miles.
The driving distance from Fort St. John (YXJ) to Bella Coola (QBC) is 691 miles / 1112 kilometers, and the travel time by car is about 15 hours 42 minutes.
Fort St. John Airport – Bella Coola Airport
Distance from Fort St. John to Bella Coola
There are several ways to calculate the distance from Fort St. John to Bella Coola. Here are two standard methods:
Vincenty's formula (applied above)
- 356.144 miles
- 573.158 kilometers
- 309.481 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
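Vincenty's inverse method has no closed form: it iterates on the longitude difference between the two points until it converges. The following is a minimal pure-Python sketch on the WGS-84 ellipsoid, using the airport coordinates from the table below converted to decimal degrees; it is an illustration of the technique, not this site's exact implementation.

```python
import math

def vincenty_inverse_km(lat1, lon1, lat2, lon2):
    """Geodesic distance in km on the WGS-84 ellipsoid (Vincenty's inverse formula)."""
    a = 6378137.0                 # semi-major axis, metres
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# YXJ and QBC coordinates in decimal degrees (west longitudes negative)
d_km = vincenty_inverse_km(56.23806, -120.73972, 52.3875, -126.59583)
```

Run on these coordinates, the result agrees with the 573.158 km figure quoted above to well under a kilometre.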
Haversine formula
- 355.416 miles
- 571.986 kilometers
- 308.848 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
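The haversine formula is short enough to write directly. This is a self-contained sketch using a mean Earth radius of 6371 km and the airport coordinates from the table below, converted to decimal degrees; the exact radius the site uses is an assumption.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical Earth; returns (miles, km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344, km

# YXJ (56°14'17"N, 120°44'23"W) and QBC (52°23'15"N, 126°35'45"W) in decimal degrees
miles, km = haversine(56.23806, -120.73972, 52.3875, -126.59583)
```

With these inputs the result reproduces the ~355.4 mile / ~572.0 km figures above, slightly shorter than the ellipsoidal (Vincenty) distance, as expected for a spherical model.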
How long does it take to fly from Fort St. John to Bella Coola?
The estimated flight time from Fort St. John Airport to Bella Coola Airport is 1 hour and 10 minutes.
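The site does not publish its estimation method, but a common rule of thumb for such pages (assumed here, not confirmed) is a fixed allowance of roughly 30 minutes for taxi, climb, and descent, plus cruise time at about 500 mph:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: fixed overhead plus cruise at constant speed.
    Both parameters are illustrative assumptions, not the site's published values."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimate_flight_minutes(356.144)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ≈ 1 h 13 min
```

For the 356-mile great-circle distance this yields about 1 hour 13 minutes, close to the quoted 1 hour 10 minutes.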
What is the time difference between Fort St. John and Bella Coola?
Fort St. John observes Mountain Standard Time year-round (it does not use daylight saving time), while Bella Coola is on Pacific Time, so Fort St. John is 1 hour ahead of Bella Coola in winter and the two share the same clock time in summer.
Flight carbon footprint between Fort St. John Airport (YXJ) and Bella Coola Airport (QBC)
On average, flying from Fort St. John to Bella Coola generates about 77 kg of CO2 per passenger (roughly 170 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
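As a quick check of that unit conversion, the pound is defined as exactly 0.45359237 kg:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(77), 1))  # → 169.8
```

So the rounded 77 kg estimate corresponds to just under 170 lb; any published pound figure slightly above that presumably comes from the unrounded CO2 estimate.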
Map of flight path and driving directions from Fort St. John to Bella Coola
See the map of the shortest flight path between Fort St. John Airport (YXJ) and Bella Coola Airport (QBC).
Airport information
| Origin | Fort St. John Airport |
| --- | --- |
| City | Fort St. John |
| Country | Canada |
| IATA Code | YXJ |
| ICAO Code | CYXJ |
| Coordinates | 56°14′17″N, 120°44′23″W |
| Destination | Bella Coola Airport |
| --- | --- |
| City | Bella Coola |
| Country | Canada |
| IATA Code | QBC |
| ICAO Code | CYBD |
| Coordinates | 52°23′15″N, 126°35′45″W |
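The coordinates above are in degrees/minutes/seconds; the distance formulas earlier need decimal degrees. A small helper for the conversion (the hemisphere letter sets the sign):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds + hemisphere (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Airport coordinates from the tables above (lat, lon)
yxj = (dms_to_decimal(56, 14, 17, "N"), dms_to_decimal(120, 44, 23, "W"))
qbc = (dms_to_decimal(52, 23, 15, "N"), dms_to_decimal(126, 35, 45, "W"))
print(yxj)  # ≈ (56.2381, -120.7397)
print(qbc)  # ≈ (52.3875, -126.5958)
```

These are the decimal values used in the distance examples earlier on this page.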