How far is Ljubljana from Aqaba?
The distance between Aqaba (King Hussein International Airport) and Ljubljana (Ljubljana Jože Pučnik Airport) is 1595 miles / 2567 kilometers / 1386 nautical miles.
King Hussein International Airport – Ljubljana Jože Pučnik Airport
Distance from Aqaba to Ljubljana
There are several ways to calculate the distance from Aqaba to Ljubljana. Here are two standard methods:
Vincenty's formula (applied above)
- 1595.131 miles
- 2567.114 kilometers
- 1386.131 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
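As a sketch, here is the standard iterative formulation of Vincenty's inverse method on the WGS-84 ellipsoid (this is the textbook algorithm, not necessarily the exact implementation used by this page):

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse: geodesic distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# AQJ and LJU coordinates from the airport tables, in decimal degrees
d_km = vincenty_m(29.611389, 35.018056, 46.223611, 14.4575) / 1000
print(round(d_km, 1))  # ≈ 2567 km, matching the figure above
```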
Haversine formula
- 1594.774 miles
- 2566.540 kilometers
- 1385.821 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
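The haversine formula is compact enough to show in full. The sketch below uses the airport coordinates from the tables further down and assumes a mean Earth radius of 6371 km (the exact radius the site uses is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# AQJ: 29°36′41″N, 35°1′5″E → 29.611389, 35.018056
# LJU: 46°13′25″N, 14°27′27″E → 46.223611, 14.4575
d = haversine_km(29.611389, 35.018056, 46.223611, 14.4575)
print(round(d, 1))  # ≈ 2566.5 km, in line with the figure above
```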
How long does it take to fly from Aqaba to Ljubljana?
The estimated flight time from King Hussein International Airport to Ljubljana Jože Pučnik Airport is 3 hours and 31 minutes.
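The quoted duration is consistent with a common rule-of-thumb estimate (an assumption on our part, not necessarily this site's method): great-circle distance at a 500 mph average cruise speed plus a fixed 20-minute buffer for taxi, climb, and descent.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=20):
    """Rough flight-time estimate: cruise time plus a fixed overhead buffer.

    cruise_mph and overhead_min are assumed rule-of-thumb values.
    """
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1595))  # "3 h 31 min" under these assumptions
```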
What is the time difference between Aqaba and Ljubljana?
The time difference between Aqaba and Ljubljana is 2 hours: Ljubljana is 2 hours behind Aqaba.
Flight carbon footprint between King Hussein International Airport (AQJ) and Ljubljana Jože Pučnik Airport (LJU)
On average, flying from Aqaba to Ljubljana generates about 186 kg of CO2 per passenger; 186 kilograms equals roughly 409 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
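Dividing the page's own numbers gives an implied per-passenger emission factor of roughly 0.072 kg of CO2 per kilometre. The factor below is derived here for illustration; it is not the site's published methodology.

```python
# Implied by the figures above: 186 kg over the 2567.114 km Vincenty distance
KG_CO2_PER_PAX_KM = 186 / 2567.114  # ≈ 0.0724, an assumed back-of-envelope factor

def flight_co2_kg(distance_km, factor=KG_CO2_PER_PAX_KM):
    """Per-passenger CO2 estimate from great-circle distance in km."""
    return distance_km * factor

print(round(flight_co2_kg(2567.114)))  # 186 kg for the Aqaba–Ljubljana leg
```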
Map of flight path from Aqaba to Ljubljana
See the map of the shortest flight path between King Hussein International Airport (AQJ) and Ljubljana Jože Pučnik Airport (LJU).
Airport information
| Origin | King Hussein International Airport |
| --- | --- |
| City: | Aqaba |
| Country: | Jordan |
| IATA Code: | AQJ |
| ICAO Code: | OJAQ |
| Coordinates: | 29°36′41″N, 35°1′5″E |
| Destination | Ljubljana Jože Pučnik Airport |
| --- | --- |
| City: | Ljubljana |
| Country: | Slovenia |
| IATA Code: | LJU |
| ICAO Code: | LJLJ |
| Coordinates: | 46°13′25″N, 14°27′27″E |
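The coordinates in the tables are given in degrees, minutes, and seconds; the distance formulas above need decimal degrees. A small conversion helper:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds to signed decimal degrees.

    Southern and western hemispheres ("S", "W") yield negative values.
    """
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# AQJ: 29°36′41″N, 35°1′5″E
print(round(dms_to_decimal(29, 36, 41, "N"), 6))  # 29.611389
print(round(dms_to_decimal(35, 1, 5, "E"), 6))    # 35.018056
```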