How far is Bella Coola from Poplar Hill?
The distance between Poplar Hill (Poplar Hill Airport) and Bella Coola (Bella Coola Airport) is 1361 miles / 2190 kilometers / 1183 nautical miles.
The driving distance from Poplar Hill (YHP) to Bella Coola (QBC) is 2017 miles / 3246 kilometers, and travel time by car is about 44 hours 55 minutes.
Poplar Hill Airport – Bella Coola Airport
Distance from Poplar Hill to Bella Coola
There are several ways to calculate the distance from Poplar Hill to Bella Coola. Here are two standard methods:
Vincenty's formula (applied above)
- 1361.024 miles
- 2190.356 kilometers
- 1182.698 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
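For reference, Vincenty's inverse method can be sketched in Python. The WGS-84 constants and the convergence tolerance below are standard choices for this formula, not values taken from this page:

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a, f = 6378137.0, 1 / 298.257223563  # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):  # iterate until the longitude correction converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# Airport coordinates from the tables below, converted to decimal degrees
print(round(vincenty_km(52.113056, -94.255556, 52.3875, -126.595833), 1))
# ≈ 2190 km, matching the figure above
```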
Haversine formula
- 1356.656 miles
- 2183.327 kilometers
- 1178.902 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
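The haversine calculation is short enough to sketch directly. This minimal Python version assumes a mean Earth radius of 6371 km, a common spherical approximation:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# Airport coordinates from the tables below, converted to decimal degrees
YHP = (52.113056, -94.255556)   # Poplar Hill Airport
QBC = (52.3875, -126.595833)    # Bella Coola Airport
print(round(haversine_km(*YHP, *QBC), 1))  # ≈ 2183 km, matching the figure above
```

The small gap between this result and Vincenty's (about 7 km here) comes from the spherical versus ellipsoidal Earth models.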
How long does it take to fly from Poplar Hill to Bella Coola?
The estimated flight time from Poplar Hill Airport to Bella Coola Airport is 3 hours and 4 minutes.
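The page does not state how the flight time is derived. A common rule of thumb, used here as an assumption rather than the site's actual formula, divides the great-circle distance by a typical cruise speed of about 500 mph and adds roughly 30 minutes for taxi, climb, and descent:

```python
def estimate_flight_time(distance_miles: float,
                         cruise_mph: float = 500.0,
                         overhead_min: float = 30.0) -> tuple[int, int]:
    """Rough airliner flight-time estimate: cruise time plus fixed overhead."""
    total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
    return divmod(total_min, 60)  # (hours, minutes)

hours, minutes = estimate_flight_time(1361)
print(f"{hours} h {minutes} min")  # 3 h 13 min with these assumed parameters
```

That lands close to the 3 hours 4 minutes quoted above; the exact figure depends on the assumed cruise speed and overhead.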
What is the time difference between Poplar Hill and Bella Coola?
Poplar Hill (Ontario) is on Central Time and Bella Coola (British Columbia) is on Pacific Time, so Bella Coola is 2 hours behind Poplar Hill.
Flight carbon footprint between Poplar Hill Airport (YHP) and Bella Coola Airport (QBC)
On average, flying from Poplar Hill to Bella Coola generates about 171 kg of CO2 per passenger; 171 kilograms equals 377 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
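The kilogram-to-pound conversion and the implied per-mile emission rate can be checked directly. The 0.126 kg/mile factor below is derived from the figures above, not a published constant:

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

co2_kg = 171.0          # per-passenger estimate quoted above
distance_miles = 1361.0  # great-circle distance quoted above

print(round(co2_kg / KG_PER_LB))          # 377 lbs
print(round(co2_kg / distance_miles, 3))  # 0.126 kg of CO2 per passenger-mile
```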
Map of flight path and driving directions from Poplar Hill to Bella Coola
See the map of the shortest flight path between Poplar Hill Airport (YHP) and Bella Coola Airport (QBC).
Airport information
| Origin | Poplar Hill Airport |
| --- | --- |
| City: | Poplar Hill |
| Country: | Canada |
| IATA Code: | YHP |
| ICAO Code: | CPV7 |
| Coordinates: | 52°6′47″N, 94°15′20″W |
| Destination | Bella Coola Airport |
| --- | --- |
| City: | Bella Coola |
| Country: | Canada |
| IATA Code: | QBC |
| ICAO Code: | CYBD |
| Coordinates: | 52°23′15″N, 126°35′45″W |
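The coordinates above use degrees-minutes-seconds notation, while the distance formulas expect decimal degrees. A small helper, sketched here assuming this exact formatting with ′ and ″ marks, performs the conversion:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 52°6′47″N to signed decimal degrees."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    if m is None:
        raise ValueError(f"unrecognized coordinate: {dms!r}")
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    # South and West hemispheres are negative by convention
    return -value if hemi in "SW" else value

print(round(dms_to_decimal("52°6′47″N"), 4))   # 52.1131
print(round(dms_to_decimal("94°15′20″W"), 4))  # -94.2556
```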