How far is Brandon from Wunnumin Lake?
The distance between Wunnumin Lake (Wunnumin Lake Airport) and Brandon (Brandon Municipal Airport) is 504 miles / 812 kilometers / 438 nautical miles.
The driving distance from Wunnumin Lake (WNN) to Brandon (YBR) is 693 miles / 1116 kilometers, and travel time by car is about 21 hours 20 minutes.
Wunnumin Lake Airport – Brandon Municipal Airport
Distance from Wunnumin Lake to Brandon
There are several ways to calculate the distance from Wunnumin Lake to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 504.462 miles
- 811.854 kilometers
- 438.366 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
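As a sketch, an ellipsoidal-model distance like this can be computed with the third-party geopy package (an assumption here; its `geodesic()` function uses Karney's algorithm, which, like Vincenty's formula, solves the distance problem on the WGS-84 ellipsoid):

```python
# Ellipsoidal-model (WGS-84) distance between WNN and YBR.
# Assumes the geopy package is installed; geodesic() uses Karney's
# algorithm, a close cousin of Vincenty's method.
from geopy.distance import geodesic

wnn = (52.8939, -89.2892)   # Wunnumin Lake Airport, decimal degrees
ybr = (49.9100, -99.9517)   # Brandon Municipal Airport, decimal degrees

d = geodesic(wnn, ybr)
print(f"{d.miles:.1f} mi / {d.kilometers:.1f} km / {d.nautical:.1f} NM")
# ≈ 504.5 mi / 811.9 km / 438.4 NM
```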
Haversine formula
- 503.089 miles
- 809.643 kilometers
- 437.172 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
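A minimal haversine sketch in Python, using the airport coordinates from the tables below (converted to decimal degrees) and an assumed mean Earth radius of 3,958.8 miles:

```python
# Great-circle (haversine) distance, assuming a spherical Earth.
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance in statute miles (mean Earth radius assumed)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_mi * asin(sqrt(a))

# WNN → YBR, coordinates from the airport tables below
print(haversine_miles(52.8939, -89.2892, 49.9100, -99.9517))  # ≈ 503.1 miles
```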
How long does it take to fly from Wunnumin Lake to Brandon?
The estimated flight time from Wunnumin Lake Airport to Brandon Municipal Airport is 1 hour and 27 minutes.
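As an illustration only, estimates like this are often built from an assumed average ground speed plus a fixed takeoff-and-landing allowance; the constants below are assumptions and do not exactly reproduce the 1 hour 27 minute figure above:

```python
# Hypothetical rule of thumb: fly the great-circle distance at an assumed
# average speed, plus a fixed allowance for climb and descent.
def flight_time_minutes(distance_mi, avg_speed_mph=500, overhead_min=30):
    return distance_mi / avg_speed_mph * 60 + overhead_min

t = flight_time_minutes(504.5)
print(f"about {int(t // 60)} h {round(t % 60)} min")  # about 1 h 31 min with these constants
```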
What is the time difference between Wunnumin Lake and Brandon?
The time difference between Wunnumin Lake and Brandon is 1 hour: Wunnumin Lake is on Eastern Time (UTC−5) and Brandon is on Central Time (UTC−6), so Brandon is 1 hour behind Wunnumin Lake.
Flight carbon footprint between Wunnumin Lake Airport (WNN) and Brandon Municipal Airport (YBR)
On average, flying from Wunnumin Lake to Brandon generates about 99 kg of CO2 per passenger; 99 kilograms equals about 218 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
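A quick unit check on that conversion:

```python
# kg → lb conversion for the per-passenger CO2 figure above
co2_kg = 99
print(round(co2_kg * 2.20462))  # 218 lb
```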
Map of flight path and driving directions from Wunnumin Lake to Brandon
See the map of the shortest flight path between Wunnumin Lake Airport (WNN) and Brandon Municipal Airport (YBR).
Airport information
| Origin | Wunnumin Lake Airport |
| --- | --- |
| City: | Wunnumin Lake |
| Country: | Canada |
| IATA Code: | WNN |
| ICAO Code: | CKL3 |
| Coordinates: | 52°53′38″N, 89°17′21″W |
| Destination | Brandon Municipal Airport |
| --- | --- |
| City: | Brandon |
| Country: | Canada |
| IATA Code: | YBR |
| ICAO Code: | CYBR |
| Coordinates: | 49°54′36″N, 99°57′6″W |