
How far is Jambi from Abu Dhabi?

The distance between Abu Dhabi (Abu Dhabi International Airport) and Jambi (Sultan Thaha Syaifuddin Airport) is 3742 miles / 6023 kilometers / 3252 nautical miles.

Abu Dhabi International Airport – Sultan Thaha Syaifuddin Airport

  • 3742 miles
  • 6023 kilometers
  • 3252 nautical miles


Distance from Abu Dhabi to Jambi

There are several ways to calculate the distance from Abu Dhabi to Jambi. Here are two standard methods:

Vincenty's formula (applied above)
  • 3742.454 miles
  • 6022.897 kilometers
  • 3252.104 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
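As a rough illustration, below is a self-contained Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the AUH and DJB coordinates listed under "Airport information". The ellipsoid parameters and the decimal-degree coordinates are assumptions (standard WGS-84 constants and a conversion of the DMS coordinates below); this is not necessarily the exact implementation behind the figures above.

    import math

    # WGS-84 ellipsoid parameters (assumed; the calculator's datum is not stated)
    A_AXIS = 6378137.0            # semi-major axis in metres
    F = 1 / 298.257223563         # flattening
    B_AXIS = A_AXIS * (1 - F)     # semi-minor axis in metres

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Ellipsoidal distance in metres between two points in decimal degrees."""
        U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                                      # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * F * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (2 * cos_2sm ** 2 - 1)
            - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
        return B_AXIS * A * (sigma - d_sigma)                   # metres

    # AUH (24°25′58″N, 54°39′3″E) and DJB (1°38′16″S, 103°38′38″E) as decimal degrees
    metres = vincenty_inverse(24.4328, 54.6508, -1.6378, 103.6439)
    print(f"{metres / 1609.344:.1f} mi / {metres / 1000:.1f} km / {metres / 1852:.1f} NM")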

Haversine formula
  • 3743.077 miles
  • 6023.899 kilometers
  • 3252.645 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
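For comparison, a minimal haversine sketch, assuming the commonly used mean Earth radius of 6371.0088 km (the radius behind the figures above is not stated):

    import math

    EARTH_RADIUS_KM = 6371.0088   # IUGG mean Earth radius (assumed)

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometres between two points in decimal degrees."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

    km = haversine_km(24.4328, 54.6508, -1.6378, 103.6439)   # AUH -> DJB
    print(f"{km / 1.609344:.1f} mi / {km:.1f} km / {km / 1.852:.1f} NM")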

How long does it take to fly from Abu Dhabi to Jambi?

The estimated flight time from Abu Dhabi International Airport to Sultan Thaha Syaifuddin Airport is 7 hours and 35 minutes.
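The site does not publish its flight-time formula. A common rule of thumb is distance divided by an assumed average cruise speed, plus a fixed allowance for taxi, climb, and descent; the sketch below uses assumed values of 850 km/h and 30 minutes, which happen to reproduce the 7 hours 35 minutes quoted above.

    # Assumed rule of thumb: distance / average cruise speed + fixed overhead.
    CRUISE_KMH = 850.0       # assumed average ground speed
    OVERHEAD_MIN = 30.0      # assumed allowance for taxi, climb and descent

    distance_km = 6022.897   # Vincenty distance from above
    total_min = distance_km / CRUISE_KMH * 60 + OVERHEAD_MIN
    hours, minutes = divmod(round(total_min), 60)
    print(f"Estimated flight time: {hours} h {minutes} min")   # 7 h 35 min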

Flight carbon footprint between Abu Dhabi International Airport (AUH) and Sultan Thaha Syaifuddin Airport (DJB)

On average, flying from Abu Dhabi to Jambi generates about 424 kg (936 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
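As a sketch of how such a per-passenger figure can be estimated from distance alone, the snippet below multiplies the great-circle distance by an assumed emission factor. The 0.0704 kg of CO2 per passenger-kilometre is an assumption chosen to be consistent with the figure above, not the calculator's published methodology, and the pound conversion comes out a pound or two below the quoted 936 lbs because of rounding.

    # Assumed emission factor, chosen to be consistent with the figure above;
    # the calculator's actual methodology is not published.
    KG_CO2_PER_PAX_KM = 0.0704
    LBS_PER_KG = 2.20462

    distance_km = 6022.897   # Vincenty distance from above
    co2_kg = distance_km * KG_CO2_PER_PAX_KM
    print(f"~{co2_kg:.0f} kg CO2 per passenger (~{co2_kg * LBS_PER_KG:.0f} lbs)")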

Map of flight path from Abu Dhabi to Jambi

See the map of the shortest flight path between Abu Dhabi International Airport (AUH) and Sultan Thaha Syaifuddin Airport (DJB).

Airport information

Origin: Abu Dhabi International Airport
City: Abu Dhabi
Country: United Arab Emirates
IATA Code: AUH
ICAO Code: OMAA
Coordinates: 24°25′58″N, 54°39′3″E
Destination: Sultan Thaha Syaifuddin Airport
City: Jambi
Country: Indonesia
IATA Code: DJB
ICAO Code: WIPA
Coordinates: 1°38′16″S, 103°38′38″E
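
For reference, a small helper (hypothetical, not part of the site) for converting the DMS coordinates listed above into the decimal degrees used in the distance sketches earlier:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    auh = (dms_to_decimal(24, 25, 58, "N"), dms_to_decimal(54, 39, 3, "E"))    # AUH
    djb = (dms_to_decimal(1, 38, 16, "S"), dms_to_decimal(103, 38, 38, "E"))   # DJB
    print(auh)   # approximately (24.4328, 54.6508)
    print(djb)   # approximately (-1.6378, 103.6439)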