
How far is Jambi from Changchun?

The distance between Changchun (Changchun Longjia International Airport) and Jambi (Sultan Thaha Syaifuddin Airport) is 3424 miles / 5511 kilometers / 2975 nautical miles.

Changchun Longjia International Airport – Sultan Thaha Syaifuddin Airport

Distance: 3424 miles / 5511 kilometers / 2975 nautical miles


Distance from Changchun to Jambi

There are several ways to calculate the distance from Changchun to Jambi. Here are two standard methods:

Vincenty's formula (applied above)
  • 3424.108 miles
  • 5510.568 kilometers
  • 2975.469 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
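
As a rough illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid, using the airport coordinates listed in the airport information section below. The function name, iteration cap, and convergence tolerance are illustrative choices, not taken from this page.

```python
import math

def vincenty_inverse_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: ellipsoidal distance in metres (WGS-84)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                       # iterate until lambda converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0                         # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cosSqAlpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cosSqAlpha if cosSqAlpha else 0.0
        C = f / 16 * cosSqAlpha * (4 + f * (4 - 3 * cosSqAlpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    uSq = cosSqAlpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + uSq / 16384 * (4096 + uSq * (-768 + uSq * (320 - 175 * uSq)))
    B = uSq / 1024 * (256 + uSq * (-128 + uSq * (74 - 47 * uSq)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# Changchun (CGQ) to Jambi (DJB), decimal degrees from the airport table below
dist_m = vincenty_inverse_m(43.99611, 125.68472, -1.63778, 103.64389)
print(f"{dist_m / 1609.344:.1f} mi / {dist_m / 1000:.1f} km / {dist_m / 1852:.1f} NM")
# expected to land close to the figures quoted above
```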

Haversine formula
  • 3434.502 miles
  • 5527.295 kilometers
  • 2984.501 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
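
For comparison, a minimal haversine sketch in Python (spherical Earth) looks like this. The page does not state which sphere radius it uses; 6371 km is the conventional mean Earth radius, so the result should land near the haversine figure above but may differ in the last digits.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Changchun (CGQ) to Jambi (DJB), decimal degrees from the airport table below
d_km = haversine_km(43.99611, 125.68472, -1.63778, 103.64389)
print(f"{d_km / 1.609344:.1f} mi / {d_km:.1f} km / {d_km / 1.852:.1f} NM")
```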

How long does it take to fly from Changchun to Jambi?

The estimated flight time from Changchun Longjia International Airport to Sultan Thaha Syaifuddin Airport is 6 hours and 58 minutes.
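
The page does not publish the assumptions behind this estimate. As a back-of-the-envelope check, the quoted time is roughly what you get from the great-circle distance at an average block speed of about 790 km/h; that speed is an assumption chosen to match the figure, not a value stated by the site.

```python
# Rough check of the quoted flight time (assumed ~790 km/h average block speed)
distance_km = 5510.568
avg_speed_kmh = 790                      # hypothetical gate-to-gate average
hours = distance_km / avg_speed_kmh
print(f"{int(hours)} h {round((hours % 1) * 60)} min")  # -> about 6 h 59 min
```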

Flight carbon footprint between Changchun Longjia International Airport (CGQ) and Sultan Thaha Syaifuddin Airport (DJB)

On average, flying from Changchun to Jambi generates about 385 kg of CO2 per passenger, and 385 kilograms equals 849 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
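
The emission model behind the 385 kg figure is not published on this page; the snippet below only reproduces the unit conversion and the per-kilometre rate implied by the numbers quoted above.

```python
co2_kg = 385.0                                    # per-passenger estimate quoted above
distance_km = 5510.568

print(f"{co2_kg * 2.20462:.0f} lb")               # -> 849 lb
print(f"{co2_kg * 1000 / distance_km:.0f} g CO2 per passenger-km")  # -> about 70 g
```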

Map of flight path from Changchun to Jambi

See the map of the shortest flight path between Changchun Longjia International Airport (CGQ) and Sultan Thaha Syaifuddin Airport (DJB).

Airport information

Origin: Changchun Longjia International Airport
City: Changchun
Country: China
IATA Code: CGQ
ICAO Code: ZYCC
Coordinates: 43°59′46″N, 125°41′5″E
Destination: Sultan Thaha Syaifuddin Airport
City: Jambi
Country: Indonesia
IATA Code: DJB
ICAO Code: WIPA
Coordinates: 1°38′16″S, 103°38′38″E