How far is Prince Albert from Banja Luka?
The distance between Banja Luka (Banja Luka International Airport) and Prince Albert (Prince Albert (Glass Field) Airport) is 4880 miles / 7854 kilometers / 4241 nautical miles.
Banja Luka International Airport – Prince Albert (Glass Field) Airport
Distance from Banja Luka to Prince Albert
There are several ways to calculate the distance from Banja Luka to Prince Albert. Here are two standard methods:
Vincenty's formula (applied above)
- 4880.137 miles
- 7853.820 kilometers
- 4240.723 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
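As a quick reproducibility check, an ellipsoidal distance can be sketched with geopy (an assumed third-party library choice; its `geodesic()` uses Karney's algorithm, a more robust successor to Vincenty's formula, so it should agree with the figure above to within a fraction of a mile rather than exactly):

```python
# Ellipsoidal (WGS-84) distance sketch using geopy's geodesic(), which
# implements Karney's algorithm rather than Vincenty's formula itself.
# Coordinates are taken from the airport tables below.
from geopy.distance import geodesic

bnx = (44.94139, 17.29750)    # Banja Luka International Airport (BNX)
ypa = (53.21417, -105.67278)  # Prince Albert (Glass Field) Airport (YPA)

d = geodesic(bnx, ypa)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
```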
Haversine formula
- 4865.835 miles
- 7830.802 kilometers
- 4228.295 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the sphere's surface).
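The haversine formula is compact enough to implement directly. A minimal sketch, using the airport coordinates from the tables below and an assumed mean earth radius of 6371 km (the exact match with the figure above depends on which radius the calculator uses):

```python
# Self-contained haversine sketch (spherical earth). With the BNX and YPA
# coordinates it lands within a few kilometers of the 7830.802 km figure.
from math import radians, sin, cos, atan2, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two lat/lon points, in kilometers."""
    R = 6371.0  # mean earth radius in km (an assumed value)
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return R * 2 * atan2(sqrt(a), sqrt(1 - a))

print(haversine_km(44.94139, 17.29750, 53.21417, -105.67278))  # ≈ 7830 km
```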
How long does it take to fly from Banja Luka to Prince Albert?
The estimated flight time from Banja Luka International Airport to Prince Albert (Glass Field) Airport is 9 hours and 44 minutes.
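The page does not say how this estimate is derived. A common convention for such calculators is the great-circle distance divided by an assumed cruise speed, plus a fixed allowance for takeoff and landing; the parameters below are assumptions chosen to reproduce the 9 hour 44 minute figure, not published values:

```python
# Hedged back-of-the-envelope flight-time estimate. The 850 km/h cruise
# speed and 30-minute takeoff/landing allowance are assumptions that happen
# to reproduce the estimate above; they are not the calculator's parameters.
CRUISE_KMH = 850.0
OVERHEAD_MIN = 30

hours = 7853.820 / CRUISE_KMH + OVERHEAD_MIN / 60
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")  # -> 9 h 44 min
```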
What is the time difference between Banja Luka and Prince Albert?
Banja Luka is on Central European Time (UTC+1, UTC+2 in summer), while Prince Albert, Saskatchewan stays on Central Standard Time (UTC−6) year-round, so Banja Luka is 7 hours ahead of Prince Albert (8 hours while European daylight saving time is in effect).
Flight carbon footprint between Banja Luka International Airport (BNX) and Prince Albert (Glass Field) Airport (YPA)
On average, flying from Banja Luka to Prince Albert generates about 568 kg (1,253 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
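Working backwards from the page's own numbers, this corresponds to roughly 72 g of CO2 per passenger-kilometre. The arithmetic below is a sanity check of that implied factor, not the calculator's actual methodology:

```python
# Implied emission factor, derived from the page's own figures rather than
# a published methodology: 568 kg CO2 over 7853.82 km.
CO2_KG = 568
DIST_KM = 7853.820

factor_g_per_km = CO2_KG * 1000 / DIST_KM
print(f"{factor_g_per_km:.1f} g CO2 per passenger-km")  # ≈ 72.3

pounds = CO2_KG * 2.20462  # kg -> lb conversion
print(f"{pounds:.0f} lb")  # ≈ 1252 lb; the page's 1,253 lb presumably
                           # comes from an unrounded kg value
```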
Map of flight path from Banja Luka to Prince Albert
See the map of the shortest flight path between Banja Luka International Airport (BNX) and Prince Albert (Glass Field) Airport (YPA).
Airport information
| Origin | Banja Luka International Airport |
|---|---|
| City | Banja Luka |
| Country | Bosnia and Herzegovina |
| IATA Code | BNX |
| ICAO Code | LQBK |
| Coordinates | 44°56′29″N, 17°17′51″E |
| Destination | Prince Albert (Glass Field) Airport |
|---|---|
| City | Prince Albert |
| Country | Canada |
| IATA Code | YPA |
| ICAO Code | CYPA |
| Coordinates | 53°12′51″N, 105°40′22″W |
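To use these coordinates with the distance formulas above, the degrees-minutes-seconds notation must be converted to decimal degrees. A small sketch, assuming the input strings follow this page's exact notation (e.g. "44°56′29″N"):

```python
# DMS-to-decimal-degrees helper; the parsing pattern is an assumption
# matching the coordinate format used in the tables above.
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 53°12′51″W to signed decimal degrees."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(dms_to_decimal("44°56′29″N"), dms_to_decimal("17°17′51″E"))   # BNX
print(dms_to_decimal("53°12′51″N"), dms_to_decimal("105°40′22″W"))  # YPA
```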