How far is Aqaba from Tokyo?

The distance between Tokyo (Haneda Airport) and Aqaba (King Hussein International Airport) is 5805 miles / 9343 kilometers / 5045 nautical miles.

The driving distance from Tokyo (HND) to Aqaba (AQJ) is 7333 miles / 11802 kilometers, and travel time by car is about 143 hours 52 minutes.

Haneda Airport – King Hussein International Airport

5805 miles / 9343 kilometers / 5045 nautical miles

Distance from Tokyo to Aqaba

There are several ways to calculate the distance from Tokyo to Aqaba. Here are two standard methods:

Vincenty's formula (applied above)
  • 5805.340 miles
  • 9342.789 kilometers
  • 5044.703 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
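For illustration, here is a minimal Python sketch of Vincenty's inverse formula, assuming the WGS-84 ellipsoid (the calculator does not state which reference ellipsoid it uses) and the airport coordinates listed further down, converted to decimal degrees:

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Distance via Vincenty's inverse formula on the WGS-84 ellipsoid."""
        a = 6378137.0            # semi-major axis in meters
        f = 1 / 298.257223563    # flattening
        b = (1 - f) * a          # semi-minor axis
        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)
        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                       # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
                  (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
            if abs(lam - lam_prev) < tol:
                break                            # converged
        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (cos_sigma * (2 * cos_2sm ** 2 - 1)
                  - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
        return b * A * (sigma - d_sigma) / 1609.344   # meters to statute miles

    # HND (35°33′8″N, 139°46′47″E) to AQJ (29°36′41″N, 35°1′5″E)
    print(round(vincenty_miles(35.5522, 139.7797, 29.6114, 35.0181), 3))  # ≈ 5805 miles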

Haversine formula
  • 5793.713 miles
  • 9324.078 kilometers
  • 5034.599 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
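As a companion sketch, the haversine formula in Python, assuming a mean earth radius of 6371 kilometers (a common convention; the exact radius the calculator uses is not stated):

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance assuming a spherical earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a)) / 1.609344  # km to miles

    print(round(haversine_miles(35.5522, 139.7797, 29.6114, 35.0181), 3))  # ≈ 5794 miles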

How long does it take to fly from Tokyo to Aqaba?

The estimated flight time from Haneda Airport to King Hussein International Airport is 11 hours and 29 minutes.
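The calculator does not publish its method, but the quoted figure is consistent with a simple heuristic: cruise time at roughly 850 km/h plus about 30 minutes for taxi, climb, and descent. Both parameters in the sketch below are assumptions back-fitted to the number above, not the site's stated formula.

    def estimate_flight_time(distance_km, cruise_kmh=850.0, overhead_h=0.5):
        """Block time = cruise segment + fixed taxi/climb/descent allowance."""
        hours = distance_km / cruise_kmh + overhead_h
        return int(hours), int(hours % 1 * 60)

    h, m = estimate_flight_time(9343)
    print(f"{h} hours {m} minutes")  # 11 hours 29 minutes with these assumptions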

Flight carbon footprint between Haneda Airport (HND) and King Hussein International Airport (AQJ)

On average, flying from Tokyo to Aqaba generates about 690 kg of CO2 per passenger (roughly 1,522 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
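As a sanity check on the units: converting the rounded 690 kg with the exact kilogram-per-pound definition gives about 1,521 lbs, so the 1,522 lbs above likely comes from an unrounded kilogram figure. The per-kilometer factor below is back-calculated from the numbers above and is not an official emission factor.

    KG_PER_LB = 0.45359237              # exact definition of the avoirdupois pound

    co2_kg = 690                        # per-passenger estimate quoted above
    print(round(co2_kg / KG_PER_LB))    # ≈ 1521 lbs
    print(round(co2_kg / 9343 * 1000))  # ≈ 74 g CO2 per passenger-km (implied)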

Map of flight path and driving directions from Tokyo to Aqaba

See the map of the shortest flight path between Haneda Airport (HND) and King Hussein International Airport (AQJ).

Airport information

Origin: Haneda Airport
City: Tokyo
Country: Japan
IATA Code: HND
ICAO Code: RJTT
Coordinates: 35°33′8″N, 139°46′47″E
Destination: King Hussein International Airport
City: Aqaba
Country: Jordan
IATA Code: AQJ
ICAO Code: OJAQ
Coordinates: 29°36′41″N, 35°1′5″E
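For use with the formulas above, a small helper (hypothetical, not part of the site) that converts the DMS coordinates listed here into the decimal degrees the distance functions expect:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds + hemisphere letter to decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    hnd = (dms_to_decimal(35, 33, 8, "N"), dms_to_decimal(139, 46, 47, "E"))
    aqj = (dms_to_decimal(29, 36, 41, "N"), dms_to_decimal(35, 1, 5, "E"))
    print(hnd)  # ≈ (35.5522, 139.7797)
    print(aqj)  # ≈ (29.6114, 35.0181)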