How far is Bella Coola from Antofagasta?

The distance between Antofagasta (Andrés Sabella Gálvez International Airport) and Bella Coola (Bella Coola Airport) is 6218 miles / 10006 kilometers / 5403 nautical miles.

Andrés Sabella Gálvez International Airport – Bella Coola Airport
  • 6218 miles
  • 10006 kilometers
  • 5403 nautical miles


Distance from Antofagasta to Bella Coola

There are several ways to calculate the distance from Antofagasta to Bella Coola. Here are two standard methods:

Vincenty's formula (applied above)
  • 6217.685 miles
  • 10006.395 kilometers
  • 5403.021 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet, which accounts for the earth's slight flattening at the poles and is therefore more accurate than spherical methods.
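The iterative Vincenty inverse method can be sketched as follows. This is a minimal implementation on the WGS-84 ellipsoid; the function name `vincenty_km` and the decimal-degree coordinates (converted from the DMS values listed under "Airport information" below) are this sketch's own, not taken from the calculator.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # difference in longitude on the auxiliary sphere, iterated
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000  # metres -> kilometers

# ANF and QBC coordinates in decimal degrees (south/west negative)
anf = (-23.444444, -70.445000)   # 23°26'40"S, 70°26'42"W
qbc = (52.387500, -126.595833)   # 52°23'15"N, 126°35'45"W
print(round(vincenty_km(*anf, *qbc), 1))  # close to the 10006.4 km figure above
```

The iteration converges in a handful of steps for this route; the formula can fail to converge only for nearly antipodal points, which does not apply here.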

Haversine formula
  • 6231.395 miles
  • 10028.458 kilometers
  • 5414.934 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
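The haversine calculation is short enough to show in full. A minimal sketch, assuming a mean Earth radius of 6371 km; the coordinates are the airport positions from the "Airport information" section converted to decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# ANF -> QBC in decimal degrees (south/west negative)
d = haversine_km(-23.444444, -70.445000, 52.387500, -126.595833)
print(round(d, 1))  # ≈ 10028 km, matching the haversine figure above
```

The roughly 22 km gap between this result and Vincenty's is typical: treating the earth as a sphere introduces errors of up to about 0.5 % on long routes.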

How long does it take to fly from Antofagasta to Bella Coola?

The estimated flight time from Andrés Sabella Gálvez International Airport to Bella Coola Airport is 12 hours and 16 minutes.
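The page does not state how this figure is derived; a flight time of 12 h 16 min over 6218 miles implies an effective average speed of just over 500 mph. A back-of-the-envelope sketch using an assumed 500 mph average block speed (my assumption, not the site's method) gives a comparable estimate:

```python
def flight_time(distance_miles, avg_speed_mph=500.0):
    """Rough block time: distance divided by an assumed average speed."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

h, m = flight_time(6218)
print(f"{h} h {m} min")  # 12 h 26 min with the assumed 500 mph
```

Real block times also depend on winds aloft, routing, and taxi/climb time, which is why such estimates can differ by ten minutes or more from published schedules.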

Flight carbon footprint between Andrés Sabella Gálvez International Airport (ANF) and Bella Coola Airport (QBC)

On average, flying from Antofagasta to Bella Coola generates about 746 kg of CO2 per passenger, which equals roughly 1,645 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
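The pound conversion checks out against the exact kilogram-per-pound definition:

```python
KG_PER_LB = 0.45359237          # exact definition of the avoirdupois pound

co2_kg = 746                    # per-passenger estimate from above
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))            # 1645
```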

Map of flight path from Antofagasta to Bella Coola

See the map of the shortest flight path between Andrés Sabella Gálvez International Airport (ANF) and Bella Coola Airport (QBC).

Airport information

Origin: Andrés Sabella Gálvez International Airport
City: Antofagasta
Country: Chile
IATA Code: ANF
ICAO Code: SCFA
Coordinates: 23°26′40″S, 70°26′42″W
Destination: Bella Coola Airport
City: Bella Coola
Country: Canada
IATA Code: QBC
ICAO Code: CYBD
Coordinates: 52°23′15″N, 126°35′45″W