How far is Summer Beaver from Brandon?
The distance between Brandon (Brandon Municipal Airport) and Summer Beaver (Summer Beaver Airport) is 530 miles / 853 kilometers / 461 nautical miles.
The driving distance from Brandon (YBR) to Summer Beaver (SUR) is 620 miles / 998 kilometers, and travel time by car is about 16 hours 39 minutes.
Brandon Municipal Airport – Summer Beaver Airport
Distance from Brandon to Summer Beaver
There are several ways to calculate the distance from Brandon to Summer Beaver. Here are two standard methods:
Vincenty's formula (applied above)
- 530.158 miles
- 853.206 kilometers
- 460.694 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
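For reference, here is a minimal Python sketch of the standard Vincenty inverse formula on the WGS-84 ellipsoid. The coordinates are taken from the airport table further down the page; this is an illustration of the method, not necessarily the exact implementation behind the figures above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two points (Vincenty inverse, WGS-84)."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sigma_m ** 2) -
        B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YBR -> SUR, using the coordinates from the airport information table below
metres = vincenty_distance(49.910000, -99.951667, 52.708333, -88.541667)
print(f"{metres / 1000:.1f} km")  # ≈ 853 km
```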
Haversine formula
- 528.670 miles
- 850.812 kilometers
- 459.402 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
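The haversine calculation is short enough to show in full. The sketch below assumes a mean earth radius of 6,371 km, so its result can differ slightly from the figure quoted above depending on the radius used.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# YBR -> SUR, using the coordinates from the airport information table below
km = haversine_distance(49.910000, -99.951667, 52.708333, -88.541667)
print(f"{km:.1f} km = {km * 0.621371:.1f} mi = {km / 1.852:.1f} NM")  # ≈ 851 km / 529 mi / 459 NM
```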
How long does it take to fly from Brandon to Summer Beaver?
The estimated flight time from Brandon Municipal Airport to Summer Beaver Airport is 1 hour and 30 minutes.
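The page does not state how this estimate is derived. A common rule of thumb, shown in the hypothetical sketch below, adds a fixed allowance for taxi, climb and descent to the great-circle distance flown at an assumed average cruise speed of about 500 mph; the numbers here are illustrative assumptions, not the site's actual model.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airliner flight-time estimate: fixed overhead plus cruise time."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(530))  # ≈ 1 h 34 min, close to the 1 h 30 min quoted above
```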
What is the time difference between Brandon and Summer Beaver?
Brandon is in the Central Time Zone (UTC-6) and Summer Beaver is in the Eastern Time Zone (UTC-5), so Summer Beaver is 1 hour ahead of Brandon.
Flight carbon footprint between Brandon Municipal Airport (YBR) and Summer Beaver Airport (SUR)
On average, flying from Brandon to Summer Beaver generates about 103 kg (227 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
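As a quick sanity check of the quoted numbers (not the site's actual methodology), the unit conversion and the implied per-mile emission rate work out as follows:

```python
KG_PER_LB = 0.45359237

co2_kg = 103          # per-passenger estimate quoted above
distance_miles = 530  # great-circle distance quoted above

print(f"{co2_kg} kg ≈ {co2_kg / KG_PER_LB:.0f} lb")                    # ≈ 227 lb
print(f"≈ {co2_kg / distance_miles:.2f} kg CO2 per passenger-mile")    # ≈ 0.19
```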
Map of flight path and driving directions from Brandon to Summer Beaver
See the map of the shortest flight path between Brandon Municipal Airport (YBR) and Summer Beaver Airport (SUR).
Airport information
| Origin | Brandon Municipal Airport |
| --- | --- |
| City | Brandon |
| Country | Canada |
| IATA Code | YBR |
| ICAO Code | CYBR |
| Coordinates | 49°54′36″N, 99°57′6″W |
| Destination | Summer Beaver Airport |
| --- | --- |
| City | Summer Beaver |
| Country | Canada |
| IATA Code | SUR |
| ICAO Code | CJV7 |
| Coordinates | 52°42′30″N, 88°32′30″W |