How far is Brandon from Banja Luka?
The distance between Banja Luka (Banja Luka International Airport) and Brandon (Brandon Municipal Airport) is 4894 miles / 7876 kilometers / 4253 nautical miles.
Banja Luka International Airport – Brandon Municipal Airport
Distance from Banja Luka to Brandon
There are several ways to calculate the distance from Banja Luka to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 4893.972 miles
- 7876.085 kilometers
- 4252.746 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
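If you want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the degrees/minutes/seconds values in the airport tables at the end of this page; small rounding differences against the quoted 7876.085 km are expected.

```python
import math

# WGS-84 ellipsoid constants
A = 6378137.0              # semi-major axis in meters
F = 1 / 298.257223563      # flattening
B = (1 - F) * A            # semi-minor axis in meters

def vincenty_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in meters via Vincenty's inverse formula.

    Note: the iteration can fail to converge for nearly antipodal points.
    """
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0                                  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)         # guard for equatorial line
        C = F / 16.0 * cos2_alpha * (4.0 + F * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1.0 + 2.0 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = big_b * sin_sigma * (cos_2sigma_m + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - big_b / 6 * cos_2sigma_m
        * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return B * big_a * (sigma - delta_sigma)

# BNX 44°56′29″N, 17°17′51″E  →  YBR 49°54′36″N, 99°57′6″W
print(vincenty_m(44.9414, 17.2975, 49.9100, -99.9517) / 1000)  # ≈ 7876 km
```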
Haversine formula
- 4879.947 miles
- 7853.513 kilometers
- 4240.558 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
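A matching haversine sketch, assuming the commonly used 6371 km mean Earth radius (the site's exact radius is not stated):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius; an assumption, so results may differ slightly

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two latitude/longitude points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

print(haversine_km(44.9414, 17.2975, 49.9100, -99.9517))  # ≈ 7853 km
```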
How long does it take to fly from Banja Luka to Brandon?
The estimated flight time from Banja Luka International Airport to Brandon Municipal Airport is 9 hours and 45 minutes.
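The page does not say how this estimate is derived. A common rule of thumb, assumed here, is the great-circle distance divided by an average block speed of about 500 mph:

```python
MILES = 4893.972       # Vincenty distance from above
CRUISE_MPH = 500.0     # assumed average speed; the site's actual model is not documented

hours = MILES / CRUISE_MPH
h, m = int(hours), round((hours % 1) * 60)
print(f"{h} h {m} min")  # 9 h 47 min, close to the 9 h 45 min quoted above
```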
What is the time difference between Banja Luka and Brandon?
The time difference between Banja Luka and Brandon is 7 hours. Brandon is 7 hours behind Banja Luka.
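To check this programmatically, here is a sketch using Python's zoneinfo, assuming Banja Luka follows the Europe/Sarajevo IANA zone and Brandon, Manitoba follows America/Winnipeg:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

now = datetime.now(tz=ZoneInfo("UTC"))
delta = (now.astimezone(ZoneInfo("Europe/Sarajevo")).utcoffset()
         - now.astimezone(ZoneInfo("America/Winnipeg")).utcoffset())
print(delta)  # 7:00:00 (both zones observe DST, so the gap can shift briefly around transitions)
```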
Flight carbon footprint between Banja Luka International Airport (BNX) and Brandon Municipal Airport (YBR)
On average, flying from Banja Luka to Brandon generates about 570 kg of CO2 per passenger (roughly 1,257 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
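The pound figure is just a unit conversion (1 kg ≈ 2.20462 lb):

```python
KG_PER_PASSENGER = 570
LBS_PER_KG = 2.20462
print(round(KG_PER_PASSENGER * LBS_PER_KG))  # 1257
```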
Map of flight path from Banja Luka to Brandon
See the map of the shortest flight path between Banja Luka International Airport (BNX) and Brandon Municipal Airport (YBR).
Airport information
Origin | Banja Luka International Airport
--- | ---
City | Banja Luka
Country | Bosnia and Herzegovina
IATA Code | BNX
ICAO Code | LQBK
Coordinates | 44°56′29″N, 17°17′51″E
Destination | Brandon Municipal Airport
--- | ---
City | Brandon
Country | Canada
IATA Code | YBR
ICAO Code | CYBR
Coordinates | 49°54′36″N, 99°57′6″W