
How far is Bella Coola from Bangui?

The distance between Bangui (Bangui M'Poko International Airport) and Bella Coola (Bella Coola Airport) is 8021 miles / 12909 kilometers / 6970 nautical miles.

Bangui M'Poko International Airport – Bella Coola Airport

Distance: 8021 miles / 12909 kilometers / 6970 nautical miles
Flight time: 15 h 41 min
CO2 emission: 1 002 kg


Distance from Bangui to Bella Coola

There are several ways to calculate the distance from Bangui to Bella Coola. Here are two standard methods:

Vincenty's formula (applied above)
  • 8021.431 miles
  • 12909.242 kilometers
  • 6970.433 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
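If you want to reproduce the ellipsoidal figure yourself, below is a minimal Python sketch of Vincenty's inverse formula, assuming the WGS-84 ellipsoid. The airport coordinates come from the table at the bottom of this page; the exact ellipsoid parameters and rounding the calculator uses are not stated, so the last decimals may differ slightly.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0              # semi-major axis (m), WGS-84
    f = 1 / 298.257223563      # flattening, WGS-84
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0        # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)

# BGF and QBC in decimal degrees (converted from the DMS coordinates below)
metres = vincenty_inverse(4.3983, 18.5186, 52.3875, -126.5958)
print(metres / 1000)  # ≈ 12909 km, matching the figure above
```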

Haversine formula
  • 8015.037 miles
  • 12898.951 kilometers
  • 6964.877 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
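The haversine computation is short enough to show in full. Here is a minimal Python sketch, assuming the conventional mean earth radius of 6371 km (the radius the calculator uses is not stated, so the last decimals may differ):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# BGF and QBC in decimal degrees (from the airport table below)
print(haversine_km(4.3983, 18.5186, 52.3875, -126.5958))  # ≈ 12899 km
```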

How long does it take to fly from Bangui to Bella Coola?

The estimated flight time from Bangui M'Poko International Airport to Bella Coola Airport is 15 hours and 41 minutes.
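The calculator does not publish its timing model. A common approximation is block time = distance divided by an assumed average speed; the sketch below uses a hypothetical average block speed of 511 mph, which happens to land close to the 15 hours 41 minutes quoted above.

```python
def estimate_flight_time(distance_miles, avg_speed_mph=511.0):
    """Rough block time from distance and an assumed average speed.

    The 511 mph default is a hypothetical value, not the calculator's
    actual parameter."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

print(estimate_flight_time(8021))  # ≈ (15, 42) with the assumed 511 mph
```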

Flight carbon footprint between Bangui M'Poko International Airport (BGF) and Bella Coola Airport (QBC)

On average, flying from Bangui to Bella Coola generates about 1 002 kg of CO2 per passenger, which is equivalent to 2 208 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
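The pound figure follows from the standard conversion factor of about 2.20462 lb per kg; the quoted 2 208 lb suggests the conversion was applied to the unrounded kilogram estimate.

```python
KG_TO_LB = 2.20462  # pounds per kilogram
print(round(1002 * KG_TO_LB))  # 2209; converting before rounding gives the quoted 2 208
```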

Map of flight path from Bangui to Bella Coola

See the map of the shortest flight path between Bangui M'Poko International Airport (BGF) and Bella Coola Airport (QBC).

Airport information

Origin Bangui M'Poko International Airport
City: Bangui
Country: Central African Republic
IATA Code: BGF
ICAO Code: FEFF
Coordinates: 4°23′54″N, 18°31′7″E
Destination Bella Coola Airport
City: Bella Coola
Country: Canada
IATA Code: QBC
ICAO Code: CYBD
Coordinates: 52°23′15″N, 126°35′45″W
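The coordinates above are given in degrees, minutes, and seconds. To use them with the distance formulas earlier on this page, they must be converted to signed decimal degrees; a minimal sketch:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds and a hemisphere letter to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(4, 23, 54, "N"))    # ≈ 4.3983    (BGF latitude)
print(dms_to_decimal(18, 31, 7, "E"))    # ≈ 18.5186   (BGF longitude)
print(dms_to_decimal(52, 23, 15, "N"))   # = 52.3875   (QBC latitude)
print(dms_to_decimal(126, 35, 45, "W"))  # ≈ -126.5958 (QBC longitude)
```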