
How far is Brandon from Grayling, AK?

The distance between Grayling (Grayling Airport) and Brandon (Brandon Municipal Airport) is 2369 miles / 3813 kilometers / 2059 nautical miles.

The driving distance from Grayling (KGX) to Brandon (YBR) is 3096 miles / 4983 kilometers, and travel time by car is about 113 hours 54 minutes.
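The three distance units above follow from the fixed definitions 1 mile = 1.609344 km and 1 nautical mile = 1.852 km. A quick Python check (variable names are illustrative only):

```python
miles = 2369.447                    # Vincenty distance quoted below
km = miles * 1.609344               # statute miles -> kilometres
nm = km / 1.852                     # kilometres -> nautical miles
print(round(km, 3), round(nm, 3))   # -> 3813.255 2058.993
```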

Grayling Airport – Brandon Municipal Airport

2369 miles / 3813 kilometers / 2059 nautical miles

Distance from Grayling to Brandon

There are several ways to calculate the distance from Grayling to Brandon. Here are two standard methods:

Vincenty's formula (applied above)
  • 2369.447 miles
  • 3813.255 kilometers
  • 2058.993 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
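Vincenty's inverse method iterates on the longitude difference over an auxiliary sphere until it converges, then corrects for the ellipsoid. Below is a minimal Python sketch of the standard inverse formula on the WGS-84 ellipsoid; the function name, tolerance, and iteration cap are illustrative, not the site's actual implementation:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0                          # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)    # equatorial geodesic
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma) / 1609.344  # metres -> statute miles

# Airport coordinates from the information section below
# (decimal degrees, west longitudes negative)
print(round(vincenty_miles(62.8950, -160.0661, 49.9100, -99.9517), 1))
# -> approximately 2369.4, in line with the figure quoted above
```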

Haversine formula
  • 2362.060 miles
  • 3801.367 kilometers
  • 2052.574 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
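As a concrete check, here is a minimal haversine implementation in Python. The 6371 km mean earth radius is the usual convention, but it is an assumption here since the page does not state which radius it uses:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere, in km. Inputs in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Coordinates from the airport information section below
kgx = (62.8950, -160.0661)   # Grayling Airport
ybr = (49.9100, -99.9517)    # Brandon Municipal Airport

km = haversine_km(*kgx, *ybr)
print(round(km, 1), "km /", round(km / 1.609344, 1), "miles")
# -> roughly 3801 km / 2362 miles, matching the figures above
```

The roughly 7-mile gap between this result and the Vincenty figure comes entirely from the spherical versus ellipsoidal earth model.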

How long does it take to fly from Grayling to Brandon?

The estimated flight time from Grayling Airport to Brandon Municipal Airport is 4 hours and 59 minutes.
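The page does not publish its flight-time model, but the quoted figure is consistent with an average block speed of roughly 475 mph over the 2369-mile great-circle distance. The speed below is a back-calculated assumption for illustration only:

```python
distance_miles = 2369.447
assumed_block_speed_mph = 475           # assumption: not stated by the source
minutes = round(distance_miles / assumed_block_speed_mph * 60)
h, m = divmod(minutes, 60)
print(f"{h} h {m} min")                 # -> 4 h 59 min
```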

Flight carbon footprint between Grayling Airport (KGX) and Brandon Municipal Airport (YBR)

On average, flying from Grayling to Brandon generates about 260 kg of CO2 per passenger, which is equivalent to 573 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
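The pound figure is a straight unit conversion (1 kg ≈ 2.20462 lb); the per-mile rate below is simply implied by the two numbers quoted above:

```python
kg_co2 = 260                           # per-passenger estimate quoted above
print(round(kg_co2 * 2.20462))         # kg -> lb, prints 573
print(round(kg_co2 / 2369.447, 2))     # implied kg CO2 per passenger-mile, ~0.11
```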

Map of flight path and driving directions from Grayling to Brandon

See the map of the shortest flight path between Grayling Airport (KGX) and Brandon Municipal Airport (YBR).

Airport information

Origin: Grayling Airport
City: Grayling, AK
Country: United States
IATA Code: KGX
ICAO Code: PAGX
Coordinates: 62°53′42″N, 160°3′58″W

Destination: Brandon Municipal Airport
City: Brandon
Country: Canada
IATA Code: YBR
ICAO Code: CYBR
Coordinates: 49°54′36″N, 99°57′6″W
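The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small helper for the conversion (function name and signature are illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert D°M′S″ plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(62, 53, 42, "N"))   # Grayling latitude  ->  62.895
print(dms_to_decimal(160, 3, 58, "W"))   # Grayling longitude -> -160.0661...
print(dms_to_decimal(49, 54, 36, "N"))   # Brandon latitude   ->  49.91
print(dms_to_decimal(99, 57, 6, "W"))    # Brandon longitude  -> -99.9516...
```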