How far is Brandon from Porto Alegre?
The distance between Porto Alegre (Salgado Filho Porto Alegre International Airport) and Brandon (Brandon Municipal Airport) is 6261 miles / 10076 kilometers / 5441 nautical miles.
Salgado Filho Porto Alegre International Airport – Brandon Municipal Airport
Distance from Porto Alegre to Brandon
There are several ways to calculate the distance from Porto Alegre to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 6260.883 miles
- 10075.914 kilometers
- 5440.558 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
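As a sketch of how such a figure can be computed, here is a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates from the tables below. The convergence tolerance and iteration cap are implementation choices, not part of the formula itself.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0                # semi-major axis (m)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# POA (29°59′39″S, 51°10′17″W) to YBR (49°54′36″N, 99°57′6″W)
poa = (-(29 + 59/60 + 39/3600), -(51 + 10/60 + 17/3600))
ybr = (49 + 54/60 + 36/3600, -(99 + 57/60 + 6/3600))
km = vincenty_distance(poa[0], poa[1], ybr[0], ybr[1]) / 1000
print(f"{km:.3f} km")  # close to the 10075.914 km quoted above
```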
Haversine formula
- 6277.513 miles
- 10102.678 kilometers
- 5455.010 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
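The haversine calculation is much simpler; a minimal sketch using a mean Earth radius of 6371 km (the radius choice is an assumption, and different sites round it differently):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# POA to YBR using the coordinates listed below (decimal degrees)
km = haversine_km(-29.994167, -51.171389, 49.910000, -99.951667)
print(f"{km:.1f} km")  # close to the ~10102.7 km quoted above
```

The spherical model explains the roughly 27 km gap between the two figures: the Earth bulges at the equator, which the ellipsoidal Vincenty model accounts for.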
How long does it take to fly from Porto Alegre to Brandon?
The estimated flight time from Salgado Filho Porto Alegre International Airport to Brandon Municipal Airport is 12 hours and 21 minutes.
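Estimates like this are typically a fixed taxi/climb overhead plus distance divided by cruise speed. A sketch of that formula, where the 530 mph cruise speed and 30-minute overhead are illustrative assumptions rather than the site's actual parameters:

```python
def flight_time(distance_miles, cruise_mph=530, overhead_min=30):
    """Rough flight-time estimate: fixed overhead plus time at cruise speed."""
    minutes = overhead_min + distance_miles / cruise_mph * 60
    h, m = divmod(round(minutes), 60)
    return f"{h} hours and {m} minutes"

print(flight_time(6261))
```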
What is the time difference between Porto Alegre and Brandon?
Porto Alegre (UTC−3) is 3 hours ahead of Brandon (UTC−6, Central Standard Time). When Manitoba observes daylight saving time (UTC−5), the difference narrows to 2 hours.
Flight carbon footprint between Salgado Filho Porto Alegre International Airport (POA) and Brandon Municipal Airport (YBR)
On average, flying from Porto Alegre to Brandon generates about 752 kg of CO2 per passenger, which equals about 1,658 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
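A per-passenger estimate of this kind is usually distance multiplied by an emission factor. The 0.075 kg CO2 per km factor below is an assumption chosen to illustrate the calculation; real factors vary by aircraft type, load factor, and routing.

```python
KG_PER_KM = 0.075    # assumed per-passenger CO2 factor (kg/km) -- illustrative only
KG_TO_LBS = 2.20462  # kilograms to pounds

def co2_estimate(distance_km):
    """Return (kg, lbs) of CO2 per passenger for a flight of the given length."""
    kg = distance_km * KG_PER_KM
    return kg, kg * KG_TO_LBS

kg, lbs = co2_estimate(10076)
print(f"{kg:.0f} kg = {lbs:.0f} lbs")
```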
Map of flight path from Porto Alegre to Brandon
See the map of the shortest flight path between Salgado Filho Porto Alegre International Airport (POA) and Brandon Municipal Airport (YBR).
Airport information
| Origin | Salgado Filho Porto Alegre International Airport |
|---|---|
| City: | Porto Alegre |
| Country: | Brazil |
| IATA Code: | POA |
| ICAO Code: | SBPA |
| Coordinates: | 29°59′39″S, 51°10′17″W |
| Destination | Brandon Municipal Airport |
|---|---|
| City: | Brandon |
| Country: | Canada |
| IATA Code: | YBR |
| ICAO Code: | CYBR |
| Coordinates: | 49°54′36″N, 99°57′6″W |