
How far is Bella Coola from Charleston, WV?

The distance between Charleston (Yeager Airport) and Bella Coola (Bella Coola Airport) is 2347 miles / 3777 kilometers / 2040 nautical miles.

The driving distance from Charleston (CRW) to Bella Coola (QBC) is 2950 miles / 4747 kilometers, and travel time by car is about 58 hours 37 minutes.

Yeager Airport – Bella Coola Airport

2347 miles
3777 kilometers
2040 nautical miles


Distance from Charleston to Bella Coola

There are several ways to calculate the distance from Charleston to Bella Coola. Here are two standard methods:

Vincenty's formula (applied above)
  • 2347.063 miles
  • 3777.232 kilometers
  • 2039.542 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
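Below is a minimal sketch of how an ellipsoidal distance like the one above can be checked in Python. The page does not say which implementation it uses; this sketch assumes the pyproj library, whose Geod class solves the inverse geodesic problem on the WGS-84 ellipsoid (via Karney's algorithm rather than Vincenty's iteration), which gives an essentially identical result for this route. The decimal coordinates are converted from the DMS values listed in the airport information section.

```python
# Ellipsoidal distance check using pyproj (an assumption; the page does not
# name its implementation). Geod.inv returns forward azimuth, back azimuth,
# and distance in meters on the chosen ellipsoid.
from pyproj import Geod

# Airport coordinates from this page, converted from DMS to decimal degrees.
crw_lat, crw_lon = 38.3731, -81.5931    # Yeager Airport (CRW)
qbc_lat, qbc_lon = 52.3875, -126.5958   # Bella Coola Airport (QBC)

geod = Geod(ellps="WGS84")
_, _, meters = geod.inv(crw_lon, crw_lat, qbc_lon, qbc_lat)

print(f"{meters / 1609.344:.3f} miles")         # ~2347 miles
print(f"{meters / 1000:.3f} kilometers")        # ~3777 km
print(f"{meters / 1852:.3f} nautical miles")    # ~2040 NM
```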

Haversine formula
  • 2341.823 miles
  • 3768.799 kilometers
  • 2034.989 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
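The haversine calculation is simple enough to reproduce directly. The sketch below assumes a mean Earth radius of 6371 km and uses the same decimal coordinates as above; the result lands close to the haversine figures listed for this route.

```python
# Self-contained haversine sketch: great-circle distance on a spherical earth.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(38.3731, -81.5931, 52.3875, -126.5958)  # CRW -> QBC
print(f"{km:.1f} km, {km / 1.609344:.1f} mi, {km / 1.852:.1f} NM")  # ~3769 km
```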

How long does it take to fly from Charleston to Bella Coola?

The estimated flight time from Yeager Airport to Bella Coola Airport is 4 hours and 56 minutes.
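The page does not publish the formula behind this estimate. A common rough model divides the flight distance by an assumed cruise speed and adds a fixed allowance for taxi, climb, and descent; the sketch below uses illustrative parameters only, so its output differs slightly from the figure above.

```python
# Rough flight-time model. Cruise speed and overhead are assumptions for
# illustration, not the calculator's published parameters.
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    hours = distance_miles / cruise_mph + overhead_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} h {m:02d} min"

print(estimate_flight_time(2347))  # -> "5 h 12 min" with these assumed values
                                   #    (the page shows 4 h 56 min)
```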

Flight carbon footprint between Yeager Airport (CRW) and Bella Coola Airport (QBC)

On average, flying from Charleston to Bella Coola generates about 257 kg of CO2 per passenger, which is roughly 567 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
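The unit conversion and the per-mile rate implied by the page's own figures can be checked with a few lines of arithmetic; the per-mile number below is simply derived from those figures and is not an official emission factor.

```python
# Quick arithmetic check of the page's figures (not an emissions model).
co2_kg = 257
distance_miles = 2347

print(f"{co2_kg * 2.20462:.0f} lbs")                                     # 567 lbs
print(f"{co2_kg / distance_miles * 1000:.0f} g CO2 per passenger-mile")  # ~110 g/mi
```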

Map of flight path and driving directions from Charleston to Bella Coola

See the map of the shortest flight path between Yeager Airport (CRW) and Bella Coola Airport (QBC).

Airport information

Origin: Yeager Airport
City: Charleston, WV
Country: United States
IATA Code: CRW
ICAO Code: KCRW
Coordinates: 38°22′23″N, 81°35′35″W
Destination: Bella Coola Airport
City: Bella Coola
Country: Canada
IATA Code: QBC
ICAO Code: CYBD
Coordinates: 52°23′15″N, 126°35′45″W
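The coordinates above are given in degrees, minutes, and seconds. A small illustrative helper converts them to the decimal degrees used in the distance examples earlier on this page.

```python
# Convert DMS coordinates to decimal degrees (south and west are negative).
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(38, 22, 23, "N"), dms_to_decimal(81, 35, 35, "W"))   # CRW: ~38.373, -81.593
print(dms_to_decimal(52, 23, 15, "N"), dms_to_decimal(126, 35, 45, "W"))  # QBC: ~52.388, -126.596
```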