
How far is Brandon from Wrangell, AK?

The distance between Wrangell (Wrangell Airport) and Brandon (Brandon Municipal Airport) is 1405 miles / 2262 kilometers / 1221 nautical miles.

The driving distance from Wrangell (WRG) to Brandon (YBR) is 1834 miles / 2952 kilometers, and travel time by car is about 44 hours 37 minutes.

Wrangell Airport – Brandon Municipal Airport
1405 miles / 2262 kilometers / 1221 nautical miles


Distance from Wrangell to Brandon

There are several ways to calculate the distance from Wrangell to Brandon. Here are two standard methods:

Vincenty's formula (applied above)
  • 1405.453 miles
  • 2261.857 kilometers
  • 1221.305 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
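
The site doesn't publish its implementation, but the figures above can be reproduced with the standard Vincenty inverse iteration on the WGS-84 ellipsoid. The following Python sketch is illustrative only: the constants and tolerance are the usual published values, the coordinates are decimal-degree versions of those listed under "Airport information" below, and degenerate cases (coincident or near-antipodal points) are not handled.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns statute miles."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m) and flattening
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:  # converged
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - d_sigma)
    return meters / 1609.344

# WRG (56°29'3"N, 132°22'11"W) to YBR (49°54'36"N, 99°57'6"W)
print(round(vincenty_miles(56.4842, -132.3697, 49.9100, -99.9517), 2))
# ≈ 1405.45 miles, matching the Vincenty figure above
```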

Haversine formula
  • 1401.229 miles
  • 2255.059 kilometers
  • 1217.635 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
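
For comparison, a minimal Python sketch of the haversine computation, assuming a mean Earth radius of 6,371 km and using the same decimal-degree coordinates as above:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles on a spherical Earth."""
    R_KM = 6371.0  # mean Earth radius in km (assumed constant)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * R_KM * math.asin(math.sqrt(a))
    return km / 1.609344  # kilometers to statute miles

print(round(haversine_miles(56.4842, -132.3697, 49.9100, -99.9517), 2))
# ≈ 1401.2 miles, matching the haversine figure above
```

The small gap between the two results (about 4 miles here) comes from the haversine formula's spherical-Earth assumption, which Vincenty's ellipsoidal model refines.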

How long does it take to fly from Wrangell to Brandon?

The estimated flight time from Wrangell Airport to Brandon Municipal Airport is 3 hours and 9 minutes.
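
The page doesn't state its timing model, but dividing the Vincenty distance by 3 hours 9 minutes implies an effective average speed of roughly 446 mph. A back-of-the-envelope sketch built on that inferred speed (an assumption, not a published figure) would look like:

```python
def flight_time(distance_miles, avg_speed_mph=446.0):
    """Rough block-time estimate; the default speed is inferred from this page."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours and {m} minutes"

print(flight_time(1405.453))  # "3 hours and 9 minutes"
```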

Flight carbon footprint between Wrangell Airport (WRG) and Brandon Municipal Airport (YBR)

On average, flying from Wrangell to Brandon generates about 174 kg (roughly 383 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
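
The kilograms-to-pounds conversion is a one-liner; only the conversion factor below is standard, and the site appears to truncate rather than round:

```python
LB_PER_KG = 2.20462  # pounds per kilogram (approximate standard factor)

co2_kg = 174
print(int(co2_kg * LB_PER_KG))  # 383, matching the figure above
```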

Map of flight path and driving directions from Wrangell to Brandon

See the map of the shortest flight path between Wrangell Airport (WRG) and Brandon Municipal Airport (YBR).

Airport information

Origin: Wrangell Airport
City: Wrangell, AK
Country: United States
IATA Code: WRG
ICAO Code: PAWG
Coordinates: 56°29′3″N, 132°22′11″W

Destination: Brandon Municipal Airport
City: Brandon
Country: Canada
IATA Code: YBR
ICAO Code: CYBR
Coordinates: 49°54′36″N, 99°57′6″W
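
The coordinates above are given in degrees-minutes-seconds. A small helper (the function name is mine, not the site's) converts them to the decimal degrees used in the distance sketches earlier:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Wrangell Airport: 56°29′3″N, 132°22′11″W
print(dms_to_decimal(56, 29, 3, "N"), dms_to_decimal(132, 22, 11, "W"))
# ≈ 56.4842, -132.3697
```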