How far is Sarajevo from Luqa?
The distance between Luqa (Malta International Airport) and Sarajevo (Sarajevo International Airport) is 586 miles / 944 kilometers / 510 nautical miles.
The driving distance from Luqa (MLA) to Sarajevo (SJJ) is 760 miles / 1223 kilometers, and travel time by car is about 25 hours 44 minutes.
Malta International Airport – Sarajevo International Airport
Distance from Luqa to Sarajevo
There are several ways to calculate the distance from Luqa to Sarajevo. Here are two standard methods:
Vincenty's formula (applied above)
- 586.456 miles
- 943.809 kilometers
- 509.616 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
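As a sketch of how such an ellipsoidal distance can be computed, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates passed in are the decimal-degree equivalents of the airport coordinates listed below; the function name is our own.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)
    u1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    u2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    big_l = math.radians(lon2 - lon1)
    su1, cu1 = math.sin(u1), math.cos(u1)
    su2, cu2 = math.sin(u2), math.cos(u2)

    lam = big_l
    for _ in range(max_iter):
        sl, cl = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cu2 * sl, cu1 * su2 - su1 * cu2 * cl)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = su1 * su2 + cu1 * cu2 * cl
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cu1 * cu2 * sl / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * su1 * su2 / cos2_alpha
        c = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = big_l + (1 - c) * f * sin_alpha * (
            sigma + c * sin_sigma * (cos_2sm + c * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    big_a = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    big_b = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * big_a * (sigma - delta_sigma) / 1000.0

# MLA (35°51′26″N, 14°28′39″E) → SJJ (43°49′28″N, 18°19′53″E)
km = vincenty_km(35.8572, 14.4775, 43.8244, 18.3314)
print(round(km, 1))  # ≈ 943.8 km
```

The iteration on λ converges quickly for well-separated points like these; for nearly antipodal points Vincenty's method can fail to converge, which is why production geodesy libraries add safeguards.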
Haversine formula
- 587.036 miles
- 944.743 kilometers
- 510.121 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
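The haversine calculation is short enough to show in full. Below is a minimal Python sketch using a mean earth radius of 6,371 km, fed with the airport coordinates from the table below converted to decimal degrees; small differences from the figure above come down to the radius chosen.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers, assuming a spherical earth."""
    r = 6371.0  # mean earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# MLA (35°51′26″N, 14°28′39″E) and SJJ (43°49′28″N, 18°19′53″E) in decimal degrees
mla = (35 + 51 / 60 + 26 / 3600, 14 + 28 / 60 + 39 / 3600)
sjj = (43 + 49 / 60 + 28 / 3600, 18 + 19 / 60 + 53 / 3600)
km = haversine_km(*mla, *sjj)
print(round(km, 1))  # ≈ 944.7 km
```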
How long does it take to fly from Luqa to Sarajevo?
The estimated flight time from Malta International Airport to Sarajevo International Airport is 1 hour and 36 minutes.
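The site's exact estimation formula isn't stated, but a common rule of thumb divides the flight distance by a typical cruise speed and adds a fixed buffer for taxi, climb, and descent. The sketch below uses assumed values (500 mph cruise, 30-minute buffer) and lands in the same ballpark as the estimate above.

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time heuristic: cruise time plus a fixed taxi/climb/descent buffer.
    The cruise speed and buffer are assumptions, not the site's actual formula."""
    return distance_miles / cruise_mph * 60 + overhead_min

mins = estimate_flight_minutes(586)  # straight-line distance from above
print(f"{int(mins // 60)} h {int(mins % 60)} min")  # within a few minutes of 1 h 36 min
```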
What is the time difference between Luqa and Sarajevo?
There is no time difference between Luqa and Sarajevo: Malta and Bosnia and Herzegovina both observe Central European Time (CET/CEST).
Flight carbon footprint between Malta International Airport (MLA) and Sarajevo International Airport (SJJ)
On average, flying from Luqa to Sarajevo generates about 111 kg (245 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
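The kilograms-to-pounds conversion above can be checked in one line; the conversion constant is the standard 2.20462 lb/kg.

```python
KG_TO_LB = 2.20462262  # pounds per kilogram

co2_kg = 111
co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb))  # 245
```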
Map of flight path and driving directions from Luqa to Sarajevo
See the map of the shortest flight path between Malta International Airport (MLA) and Sarajevo International Airport (SJJ).
Airport information
| Origin | Malta International Airport |
| --- | --- |
| City | Luqa |
| Country | Malta |
| IATA Code | MLA |
| ICAO Code | LMML |
| Coordinates | 35°51′26″N, 14°28′39″E |

| Destination | Sarajevo International Airport |
| --- | --- |
| City | Sarajevo |
| Country | Bosnia and Herzegovina |
| IATA Code | SJJ |
| ICAO Code | LQSA |
| Coordinates | 43°49′28″N, 18°19′53″E |