
How far is Banja Luka from Tripoli?

The distance between Tripoli (Mitiga International Airport) and Banja Luka (Banja Luka International Airport) is 859 miles / 1382 kilometers / 746 nautical miles.

Mitiga International Airport – Banja Luka International Airport

859 miles / 1382 kilometers / 746 nautical miles


Distance from Tripoli to Banja Luka

There are several ways to calculate the distance from Tripoli to Banja Luka. Here are two standard methods:

Vincenty's formula (applied above)
  • 858.529 miles
  • 1381.668 kilometers
  • 746.041 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
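For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The ellipsoid constants, the convergence tolerance, and the metres-to-miles conversion are standard choices, not necessarily the exact parameters this calculator uses, so the last digits may differ slightly.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # both points on the equator
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# MJI and BNX coordinates from the airport information below (DMS -> decimal)
mji = (32 + 53/60 + 38/3600, 13 + 16/60 + 33/3600)
bnx = (44 + 56/60 + 29/3600, 17 + 17/60 + 51/3600)
metres = vincenty_distance(*mji, *bnx)
print(f"{metres / 1609.344:.3f} miles")  # ≈ 858.5 miles, matching the figure above
```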

Haversine formula
  • 859.704 miles
  • 1383.559 kilometers
  • 747.062 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth, giving the great-circle distance, the shortest path between two points along the sphere's surface.
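The haversine computation is short enough to show in full. This sketch assumes a mean Earth radius of 6371 km; the exact radius the calculator uses is not stated, but with this value and the airport coordinates listed below it reproduces the figure above to within a fraction of a kilometre.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# MJI (32°53′38″N, 13°16′33″E) and BNX (44°56′29″N, 17°17′51″E) in decimal degrees
km = haversine_distance(32.8939, 13.2758, 44.9414, 17.2975)
print(f"{km:.3f} km")  # ≈ 1383.6 km, matching the haversine figure above
```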

How long does it take to fly from Tripoli to Banja Luka?

The estimated flight time from Mitiga International Airport to Banja Luka International Airport is 2 hours and 7 minutes.
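The calculator does not publish its timing model, but estimates like this are typically built from an assumed average cruise speed plus a fixed allowance for taxi, takeoff, climb, and descent. The sketch below uses hypothetical parameters (500 mph cruise, 30 minutes of overhead), so it lands a few minutes away from the 2 hours 7 minutes shown above; only the structure of the calculation is illustrated, not the site's actual numbers.

```python
def estimated_flight_time(distance_miles, cruise_speed_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed allowance for
    taxi, takeoff, climb, and descent. Both parameters are assumptions,
    not the calculator's published values."""
    total_min = distance_miles / cruise_speed_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimated_flight_time(858.529))  # "2 hours and 13 minutes" under these assumptions
```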

Flight carbon footprint between Mitiga International Airport (MJI) and Banja Luka International Airport (BNX)

On average, flying from Tripoli to Banja Luka generates about 140 kg of CO2 per passenger (140 kilograms equals about 309 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
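The page's own numbers imply a per-passenger emission factor of roughly 140 kg over 1382 km, about 0.10 kg of CO2 per passenger-kilometre. The sketch below derives that factor from the figures above rather than from any official methodology, so treat it purely as an illustration of the arithmetic, including the kilograms-to-pounds conversion.

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

# Emission factor implied by this page's own figures (derived, not official):
EMISSION_FACTOR = 140 / 1382  # ≈ 0.101 kg CO2 per passenger-km

def co2_per_passenger(distance_km):
    """Estimated CO2 per passenger as (kilograms, pounds)."""
    kg = distance_km * EMISSION_FACTOR
    return kg, kg / KG_PER_LB

kg, lbs = co2_per_passenger(1382)
print(f"{kg:.0f} kg ≈ {lbs:.0f} lbs")  # 140 kg ≈ 309 lbs, as stated above
```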

Map of flight path from Tripoli to Banja Luka

See the map of the shortest flight path between Mitiga International Airport (MJI) and Banja Luka International Airport (BNX).

Airport information

Origin: Mitiga International Airport
City: Tripoli
Country: Libya
IATA Code: MJI
ICAO Code: HLLM
Coordinates: 32°53′38″N, 13°16′33″E

Destination: Banja Luka International Airport
City: Banja Luka
Country: Bosnia and Herzegovina
IATA Code: BNX
ICAO Code: LQBK
Coordinates: 44°56′29″N, 17°17′51″E
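The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page need decimal degrees. A small helper like the following can perform the conversion; the function name and regular expression are illustrative, not part of any particular library.

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like '32°53′38″N' to decimal degrees."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value  # south and west are negative

print(dms_to_decimal("32°53′38″N"))  # ≈ 32.8939 (MJI latitude)
print(dms_to_decimal("17°17′51″E"))  # ≈ 17.2975 (BNX longitude)
```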