How far is Windsor from Banja Luka?
The distance between Banja Luka (Banja Luka International Airport) and Windsor (Windsor International Airport) is 4679 miles / 7531 kilometers / 4066 nautical miles.
Banja Luka International Airport – Windsor International Airport
Distance from Banja Luka to Windsor
There are several ways to calculate the distance from Banja Luka to Windsor. Here are two standard methods:
Vincenty's formula (applied above)
- 4679.397 miles
- 7530.760 kilometers
- 4066.285 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
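Vincenty's inverse method can be sketched in plain Python. This is a minimal implementation of the standard iterative algorithm on the WGS-84 ellipsoid, fed the airport coordinates from the tables below converted to decimal degrees; the page does not publish its exact code, so treat this as an illustration rather than its method.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Distance in km between two points via Vincenty's inverse formula
    on the WGS-84 ellipsoid."""
    a = 6378137.0                # semi-major axis (metres)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a              # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):         # iterate until longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# BNX (44°56′29″N, 17°17′51″E) to YQG (42°16′32″N, 82°57′20″W)
d = vincenty_km(44.94139, 17.29750, 42.27556, -82.95556)
```

Run against these coordinates, `d` lands close to the 7530.76 km quoted above; small differences come from rounding the coordinates to seconds.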
Haversine formula
- 4666.910 miles
- 7510.663 kilometers
- 4055.434 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of a sphere).
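The haversine calculation is short enough to show in full. This sketch uses the airport coordinates from the tables below converted to decimal degrees and a mean Earth radius of 6371 km (the radius the page appears to assume, though it does not state one).

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km on a sphere of mean radius 6371 km."""
    r = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# BNX (44°56′29″N, 17°17′51″E) to YQG (42°16′32″N, 82°57′20″W)
d = haversine_km(44.94139, 17.29750, 42.27556, -82.95556)
```

The result agrees with the 7510.663 km figure above to within a few hundred metres, the residual coming from rounding the coordinates.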
How long does it take to fly from Banja Luka to Windsor?
The estimated flight time from Banja Luka International Airport to Windsor International Airport is 9 hours and 21 minutes.
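The page does not state how it derives this estimate, but the figure is reproduced by a common rule of thumb: divide the great-circle distance by an assumed average jet speed of about 500 mph. The speed is an assumption, not something the page specifies.

```python
import math

distance_miles = 4679.397        # Vincenty distance from above
avg_speed_mph = 500              # assumed typical jet cruise speed
hours = distance_miles / avg_speed_mph
h = int(hours)
m = math.floor((hours - h) * 60)
print(f"{h} hours and {m} minutes")  # 9 hours and 21 minutes
```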
What is the time difference between Banja Luka and Windsor?
The time difference between Banja Luka and Windsor is 6 hours. Windsor is 6 hours behind Banja Luka.
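This offset can be checked with Python's `zoneinfo` module, assuming the IANA zones `Europe/Sarajevo` for Banja Luka and `America/Toronto` for Windsor, Ontario. Both zones observe daylight saving on similar schedules, so the 6-hour gap holds for most of the year.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# The same instant viewed from each city's time zone.
banja_luka = datetime(2024, 1, 15, 12, 0, tzinfo=ZoneInfo("Europe/Sarajevo"))
windsor = banja_luka.astimezone(ZoneInfo("America/Toronto"))

diff_hours = (banja_luka.utcoffset() - windsor.utcoffset()).total_seconds() / 3600
print(diff_hours)  # 6.0
```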
Flight carbon footprint between Banja Luka International Airport (BNX) and Windsor International Airport (YQG)
On average, flying from Banja Luka to Windsor generates about 542 kg (roughly 1,196 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path from Banja Luka to Windsor
See the map of the shortest flight path between Banja Luka International Airport (BNX) and Windsor International Airport (YQG).
Airport information
| Origin | Banja Luka International Airport |
| --- | --- |
| City: | Banja Luka |
| Country: | Bosnia and Herzegovina |
| IATA Code: | BNX |
| ICAO Code: | LQBK |
| Coordinates: | 44°56′29″N, 17°17′51″E |
| Destination | Windsor International Airport |
| --- | --- |
| City: | Windsor |
| Country: | Canada |
| IATA Code: | YQG |
| ICAO Code: | CYQG |
| Coordinates: | 42°16′32″N, 82°57′20″W |
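The coordinates above are in degrees-minutes-seconds, while distance formulas expect decimal degrees. A small conversion helper (the function name is illustrative) covers both airports:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds + hemisphere letter to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# (latitude, longitude) pairs from the airport tables above
bnx = (dms_to_decimal(44, 56, 29, "N"), dms_to_decimal(17, 17, 51, "E"))
yqg = (dms_to_decimal(42, 16, 32, "N"), dms_to_decimal(82, 57, 20, "W"))
print(bnx)  # approximately (44.9414, 17.2975)
print(yqg)  # approximately (42.2756, -82.9556)
```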