How far is Brandon from Haines, AK?
The distance between Haines (Haines Airport) and Brandon (Brandon Municipal Airport) is 1544 miles / 2485 kilometers / 1342 nautical miles.
The driving distance from Haines (HNS) to Brandon (YBR) is 2018 miles / 3247 kilometers, and travel time by car is about 39 hours 54 minutes.
Haines Airport – Brandon Municipal Airport
Distance from Haines to Brandon
There are several ways to calculate the distance from Haines to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 1544.347 miles
- 2485.385 kilometers
- 1342.000 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
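As a rough illustration, the ellipsoidal figure above can be reproduced with the third-party geopy package. Note that geopy's geodesic() uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's iteration, so the result may differ from the quoted value by a tiny amount; the decimal coordinates are converted from the airport tables further down.

```python
# Sketch of the ellipsoidal distance using geopy (Karney's algorithm on WGS-84,
# a successor to Vincenty's method); expect a result very close to 1,544 miles.
from geopy.distance import geodesic

# Decimal-degree coordinates converted from the airport tables below
HNS = (59.2436, -135.5239)  # Haines Airport (59°14′37″N, 135°31′26″W)
YBR = (49.9100, -99.9517)   # Brandon Municipal Airport (49°54′36″N, 99°57′6″W)

d = geodesic(HNS, YBR)
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} nm")
```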
Haversine formula
- 1539.816 miles
- 2478.093 kilometers
- 1338.063 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, i.e. the shortest path between two points over the earth's surface).
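For comparison, here is a minimal, self-contained sketch of the haversine calculation, assuming a mean earth radius of 6,371 km and the decimal-degree coordinates of the two airports; it should land close to the 2,478 km figure above.

```python
# Great-circle (haversine) distance, assuming a spherical earth of radius 6,371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# HNS and YBR in decimal degrees (from the airport tables below)
km = haversine_km(59.2436, -135.5239, 49.9100, -99.9517)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} nm")
```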
How long does it take to fly from Haines to Brandon?
The estimated flight time from Haines Airport to Brandon Municipal Airport is 3 hours and 25 minutes.
What is the time difference between Haines and Brandon?
The time difference between Haines and Brandon is 3 hours. Brandon is 3 hours ahead of Haines.
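If you want to check this yourself, the sketch below uses Python's zoneinfo module, assuming the IANA zones America/Juneau for Haines (Alaska Time) and America/Winnipeg for Brandon (Central Time); both zones observe daylight saving time, so the 3-hour gap holds year-round.

```python
# Compute the current UTC-offset difference between the two cities.
from datetime import datetime
from zoneinfo import ZoneInfo

haines = ZoneInfo("America/Juneau")     # assumed IANA zone for Haines, AK (Alaska Time)
brandon = ZoneInfo("America/Winnipeg")  # assumed IANA zone for Brandon, MB (Central Time)

now = datetime.now(tz=ZoneInfo("UTC"))
diff = now.astimezone(brandon).utcoffset() - now.astimezone(haines).utcoffset()
print(f"Brandon is {diff.total_seconds() / 3600:.0f} hours ahead of Haines")
```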
Flight carbon footprint between Haines Airport (HNS) and Brandon Municipal Airport (YBR)
On average, flying from Haines to Brandon generates about 182 kg (402 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
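As a quick unit-conversion check (the published pound figure is presumably rounded from an unrounded kilogram value, which is why it differs slightly from the conversion of the rounded 182 kg):

```python
# Convert the rounded CO2 estimate from kilograms to pounds.
KG_PER_LB = 0.45359237
co2_kg = 182
print(f"{co2_kg} kg ≈ {co2_kg / KG_PER_LB:.0f} lbs")  # about 401 lbs
```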
Map of flight path and driving directions from Haines to Brandon
See the map of the shortest flight path between Haines Airport (HNS) and Brandon Municipal Airport (YBR).
Airport information
| Origin | Haines Airport |
|---|---|
| City: | Haines, AK |
| Country: | United States |
| IATA Code: | HNS |
| ICAO Code: | PAHN |
| Coordinates: | 59°14′37″N, 135°31′26″W |
| Destination | Brandon Municipal Airport |
|---|---|
| City: | Brandon |
| Country: | Canada |
| IATA Code: | YBR |
| ICAO Code: | CYBR |
| Coordinates: | 49°54′36″N, 99°57′6″W |
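The distance sketches above use decimal degrees; a small helper like the one below (a hypothetical convenience function, not part of the site) converts the DMS coordinates listed in these tables.

```python
# Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(59, 14, 37, "N"), dms_to_decimal(135, 31, 26, "W"))  # HNS
print(dms_to_decimal(49, 54, 36, "N"), dms_to_decimal(99, 57, 6, "W"))    # YBR
```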