How far is Bella Coola from Rankin Inlet?
The flight distance between Rankin Inlet (Rankin Inlet Airport) and Bella Coola (Bella Coola Airport) is 1445 miles / 2326 kilometers / 1256 nautical miles.
The driving distance from Rankin Inlet (YRT) to Bella Coola (QBC) is 1891 miles / 3043 kilometers, and travel time by car is about 44 hours 29 minutes.
Rankin Inlet Airport – Bella Coola Airport
Distance from Rankin Inlet to Bella Coola
There are several ways to calculate the distance from Rankin Inlet to Bella Coola. Here are two standard methods:
Vincenty's formula (applied above)
- 1445.057 miles
- 2325.593 kilometers
- 1255.720 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
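For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is a standard textbook implementation, not the calculator's own code, and the decimal coordinates are converted from the YRT and QBC values in the airport tables below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: geodesic distance in metres on WGS-84."""
    a = 6378137.0           # semi-major axis (m)
    f = 1 / 298.257223563   # flattening
    b = (1 - f) * a         # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until it converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        if cos2Alpha != 0:
            cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        else:
            cos2SigmaM = 0.0  # both points on the equator
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
            * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# YRT (62°48′41″N, 92°6′56″W) and QBC (52°23′15″N, 126°35′45″W) in decimal degrees
m = vincenty_distance(62.8114, -92.1156, 52.3875, -126.5958)
print(f"{m / 1609.344:.1f} mi / {m / 1000:.1f} km / {m / 1852:.1f} nmi")
```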
Haversine formula
- 1440.711 miles
- 2318.600 kilometers
- 1251.944 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
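The haversine version is much shorter; here is a minimal sketch, assuming the commonly used mean Earth radius of 6371 km:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_distance(62.8114, -92.1156, 52.3875, -126.5958)
print(f"{km / 1.609344:.1f} mi / {km:.1f} km / {km / 1.852:.1f} nmi")
```

The small gap between the two results (about 7 km) comes from the spherical approximation.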
How long does it take to fly from Rankin Inlet to Bella Coola?
The estimated flight time from Rankin Inlet Airport to Bella Coola Airport is 3 hours and 14 minutes.
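The page does not state how this estimate is derived. A common rule of thumb divides the distance by a typical airliner cruise speed and adds a fixed allowance for taxi, takeoff and landing; both parameters in the sketch below are assumptions, so it lands near, but not exactly on, the figure above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # Assumed parameters: 500 mph cruise plus a 30-minute ground/climb allowance.
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimate_flight_time(1445))  # "3 hours and 23 minutes" with these assumptions
```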
What is the time difference between Rankin Inlet and Bella Coola?
Rankin Inlet is in the Central Time Zone (UTC−6, UTC−5 in summer), while Bella Coola is in the Pacific Time Zone (UTC−8, UTC−7 in summer), so Bella Coola is 2 hours behind Rankin Inlet.
Flight carbon footprint between Rankin Inlet Airport (YRT) and Bella Coola Airport (QBC)
On average, flying from Rankin Inlet to Bella Coola generates about 176 kg (388 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
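Working backwards from the page's own numbers, the figure corresponds to roughly 76 g of CO2 per passenger-kilometre over this route. The sketch below reproduces the arithmetic; the emission factor is back-calculated, not taken from a published methodology.

```python
def co2_per_passenger(distance_km, kg_co2_per_pkm=0.0757):
    # 0.0757 kg/passenger-km is back-calculated from 176 kg over 2326 km;
    # real estimators derive it from aircraft type, load factor and fuel burn.
    kg = distance_km * kg_co2_per_pkm
    return kg, kg * 2.20462  # (kilograms, pounds)

kg, lbs = co2_per_passenger(2326)
print(f"{kg:.0f} kg CO2 ≈ {lbs:.0f} lbs")  # ~176 kg ≈ 388 lbs
```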
Map of flight path and driving directions from Rankin Inlet to Bella Coola
See the map of the shortest flight path between Rankin Inlet Airport (YRT) and Bella Coola Airport (QBC).
Airport information
| Origin | Rankin Inlet Airport |
|---|---|
| City | Rankin Inlet |
| Country | Canada |
| IATA Code | YRT |
| ICAO Code | CYRT |
| Coordinates | 62°48′41″N, 92°6′56″W |
| Destination | Bella Coola Airport |
|---|---|
| City | Bella Coola |
| Country | Canada |
| IATA Code | QBC |
| ICAO Code | CYBD |
| Coordinates | 52°23′15″N, 126°35′45″W |
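The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier expect decimal degrees. A small helper (hypothetical, not part of this page) covers the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Southern and western hemispheres are negative in decimal notation.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# YRT: 62°48′41″N, 92°6′56″W  ->  (62.8114, -92.1156)
print(dms_to_decimal(62, 48, 41, "N"), dms_to_decimal(92, 6, 56, "W"))
# QBC: 52°23′15″N, 126°35′45″W  ->  (52.3875, -126.5958)
print(dms_to_decimal(52, 23, 15, "N"), dms_to_decimal(126, 35, 45, "W"))
```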