How far is Brandon from Holy Cross, AK?
The distance between Holy Cross (Holy Cross Airport) and Brandon (Brandon Municipal Airport) is 2366 miles / 3808 kilometers / 2056 nautical miles.
The driving distance from Holy Cross (HCR) to Brandon (YBR) is 3095 miles / 4981 kilometers, and travel time by car is about 113 hours 45 minutes.
Holy Cross Airport – Brandon Municipal Airport
Distance from Holy Cross to Brandon
There are several ways to calculate the distance from Holy Cross to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 2366.051 miles
- 3807.791 kilometers
- 2056.042 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
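As a minimal sketch, here is a self-contained Python implementation of the inverse Vincenty formula on the WGS-84 ellipsoid (an assumption; the page does not say which ellipsoid or library it actually uses):

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    # Inverse Vincenty on the WGS-84 ellipsoid (assumed; not stated by the site)
    a = 6378137.0           # semi-major axis, metres
    f = 1 / 298.257223563   # flattening
    b = (1 - f) * a         # semi-minor axis, metres

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma) / 1609.344  # metres -> statute miles

# HCR (62°11′17″N, 159°46′29″W) to YBR (49°54′36″N, 99°57′6″W)
print(f"{vincenty_miles(62.188056, -159.774722, 49.91, -99.951667):.3f} miles")  # ≈ 2366
```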
Haversine formula
- 2358.653 miles
- 3795.884 kilometers
- 2049.614 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth, giving the great-circle distance (the shortest path between two points along the surface of a sphere).
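A minimal Python sketch of the haversine formula, assuming a mean Earth radius of 6371 km (the exact radius the site uses is not stated):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere; radius_km = 6371 is an assumed mean radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h)) / 1.609344  # km -> miles

# Same HCR -> YBR coordinates as above
print(f"{haversine_miles(62.188056, -159.774722, 49.91, -99.951667):.3f} miles")  # ≈ 2359
```

The haversine result is about 7 miles shorter than Vincenty's here because the sphere slightly underestimates distances along this mostly east-west, high-latitude path on the flattened ellipsoid.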
How long does it take to fly from Holy Cross to Brandon?
The estimated flight time from Holy Cross Airport to Brandon Municipal Airport is 4 hours and 58 minutes.
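The page does not publish its estimation model. A common approach is a fixed allowance for taxi, take-off, and landing plus the distance divided by an average cruise speed; the parameters below are illustrative assumptions and do not exactly reproduce the 4 hours 58 minutes quoted above:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # cruise_mph and overhead_min are hypothetical values, not the
    # site's published parameters (its model is not disclosed).
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimated_flight_time(2366.051))  # "5 hours 14 minutes" under these assumptions
```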
What is the time difference between Holy Cross and Brandon?
The time difference between Holy Cross and Brandon is 3 hours. Brandon is 3 hours ahead of Holy Cross (Alaska Time vs. Central Time; both observe daylight saving time, so the offset holds year-round).
Flight carbon footprint between Holy Cross Airport (HCR) and Brandon Municipal Airport (YBR)
On average, flying from Holy Cross to Brandon generates about 260 kg of CO2 per passenger; 260 kilograms equals about 573 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
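For reference, the kilogram-to-pound arithmetic behind that figure:

```python
KG_TO_LB = 2.20462262  # pounds per kilogram

co2_kg = 260  # per-passenger estimate quoted above
print(f"{co2_kg} kg = {co2_kg * KG_TO_LB:.0f} lb")  # 260 kg = 573 lb
```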
Map of flight path and driving directions from Holy Cross to Brandon
See the map of the shortest flight path between Holy Cross Airport (HCR) and Brandon Municipal Airport (YBR), along with the driving route between the two airports.
Airport information
| Origin | Holy Cross Airport |
| --- | --- |
| City: | Holy Cross, AK |
| Country: | United States |
| IATA Code: | HCR |
| ICAO Code: | PAHC |
| Coordinates: | 62°11′17″N, 159°46′29″W |
| Destination | Brandon Municipal Airport |
| --- | --- |
| City: | Brandon |
| Country: | Canada |
| IATA Code: | YBR |
| ICAO Code: | CYBR |
| Coordinates: | 49°54′36″N, 99°57′6″W |
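The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A minimal converter (the helper name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # South and West hemispheres map to negative decimal degrees
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Holy Cross Airport (HCR): 62°11′17″N, 159°46′29″W
print(dms_to_decimal(62, 11, 17, "N"))   # 62.188055...
print(dms_to_decimal(159, 46, 29, "W"))  # -159.774722...
```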