
How far is Jiujiang from Kota Kinabalu?

The distance between Kota Kinabalu (Kota Kinabalu International Airport) and Jiujiang (Jiujiang Lushan Airport) is 1637 miles / 2634 kilometers / 1422 nautical miles.

Kota Kinabalu International Airport – Jiujiang Lushan Airport

1637 Miles
2634 Kilometers
1422 Nautical miles


Distance from Kota Kinabalu to Jiujiang

There are several ways to calculate the distance from Kota Kinabalu to Jiujiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1636.692 miles
  • 2634.000 kilometers
  • 1422.246 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
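
As a rough illustration, the sketch below implements Vincenty's inverse formula on the WGS-84 ellipsoid in Python. The decimal coordinates are approximations converted from the DMS values listed under "Airport information" further down, so the result should land close to the 2634 km figure above; small differences in rounding or ellipsoid constants can shift the last digits.

    import math

    # WGS-84 ellipsoid parameters
    A_AXIS = 6378137.0              # semi-major axis in metres
    FLATTENING = 1 / 298.257223563
    B_AXIS = (1 - FLATTENING) * A_AXIS

    def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Ellipsoidal distance (km) via Vincenty's inverse formula."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - FLATTENING) * math.tan(phi1))
        U2 = math.atan((1 - FLATTENING) * math.tan(phi2))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                    # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos_sq_alpha = 1 - sin_alpha ** 2
            # cos(2 * sigma_m); zero for geodesics along the equator
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                       if cos_sq_alpha else 0.0)
            C = FLATTENING / 16 * cos_sq_alpha * (4 + FLATTENING * (4 - 3 * cos_sq_alpha))
            lam_prev = lam
            lam = L + (1 - C) * FLATTENING * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u_sq = cos_sq_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
        A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
        B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
        delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return B_AXIS * A * (sigma - delta_sigma) / 1000.0

    # Approximate decimal coordinates for BKI and JIU
    print(round(vincenty_km(5.9369, 116.0508, 29.7328, 115.9828)))  # ~2634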

Haversine formula
  • 1644.138 miles
  • 2645.984 kilometers
  • 1428.717 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
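
For comparison, a minimal haversine sketch in Python is shown below, again using approximate decimal coordinates for the two airports. With a mean Earth radius of 6371 km it gives roughly 2646 km, in line with the haversine figures above; the calculator may use a slightly different radius.

    import math

    EARTH_RADIUS_KM = 6371.0  # mean Earth radius (assumed; the site's exact value is not stated)

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance (km) between two lat/lon points on a sphere."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

    # Approximate decimal coordinates for BKI and JIU
    km = haversine_km(5.9369, 116.0508, 29.7328, 115.9828)
    print(round(km), round(km / 1.609344), round(km / 1.852))  # ~2646 km, ~1644 mi, ~1429 NM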

How long does it take to fly from Kota Kinabalu to Jiujiang?

The estimated flight time from Kota Kinabalu International Airport to Jiujiang Lushan Airport is 3 hours and 35 minutes.
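
Estimates of this kind are typically derived from the distance using an assumed average cruise speed plus a fixed allowance for taxi, climb, and descent. The constants in the sketch below are illustrative assumptions, not the calculator's actual parameters, so it does not reproduce the 3 hours 35 minutes figure exactly.

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.
        Both constants are assumptions for illustration only."""
        total_min = overhead_min + distance_miles / cruise_mph * 60
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} h {minutes} min"

    print(estimate_flight_time(1637))  # ~3 h 46 min with these assumed constants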

What is the time difference between Kota Kinabalu and Jiujiang?

Both airports observe UTC+8, so there is no time difference between Kota Kinabalu and Jiujiang.
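
One way to confirm this is to compare UTC offsets with Python's zoneinfo module (Python 3.9+). The IANA zone names below are assumptions: Asia/Kuching covers Sabah, including Kota Kinabalu, and Asia/Shanghai covers mainland China, including Jiujiang.

    from datetime import datetime
    from zoneinfo import ZoneInfo

    now = datetime.now(tz=ZoneInfo("UTC"))
    bki_offset = now.astimezone(ZoneInfo("Asia/Kuching")).utcoffset()
    jiu_offset = now.astimezone(ZoneInfo("Asia/Shanghai")).utcoffset()
    print(bki_offset, jiu_offset, bki_offset == jiu_offset)  # 8:00:00 8:00:00 True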

Flight carbon footprint between Kota Kinabalu International Airport (BKI) and Jiujiang Lushan Airport (JIU)

On average, flying from Kota Kinabalu to Jiujiang generates about 188 kg of CO2 per passenger (188 kilograms is equal to 415 pounds). The figures are estimates and include only the CO2 generated by burning jet fuel.
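
The page does not state how the 188 kg figure is derived. The sketch below simply back-computes an implied per-passenger-mile emission factor from the numbers above and shows the kilogram-to-pound conversion; the factor is an assumption for illustration, not the calculator's actual methodology.

    KG_PER_LB = 0.45359237

    distance_miles = 1636.692   # Vincenty distance from above
    co2_kg = 188                # the page's per-passenger estimate

    implied_factor = co2_kg / distance_miles   # ~0.115 kg CO2 per passenger-mile
    co2_lbs = co2_kg / KG_PER_LB               # ~414-415 lbs, depending on rounding
    print(f"{implied_factor:.3f} kg/mile, {co2_lbs:.0f} lbs")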

Map of flight path from Kota Kinabalu to Jiujiang

See the map of the shortest flight path between Kota Kinabalu International Airport (BKI) and Jiujiang Lushan Airport (JIU).

Airport information

Origin: Kota Kinabalu International Airport
City: Kota Kinabalu
Country: Malaysia
IATA Code: BKI
ICAO Code: WBKK
Coordinates: 5°56′13″N, 116°3′3″E

Destination: Jiujiang Lushan Airport
City: Jiujiang
Country: China
IATA Code: JIU
ICAO Code: ZSJJ
Coordinates: 29°43′58″N, 115°58′58″E
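
The coordinates above are given in degrees, minutes, and seconds. A small helper like the one below (an illustrative sketch, not part of the site) converts them to the decimal degrees used by the distance formulas earlier on this page.

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    # Kota Kinabalu International Airport (BKI): 5°56′13″N, 116°3′3″E
    print(dms_to_decimal(5, 56, 13, "N"), dms_to_decimal(116, 3, 3, "E"))     # ~5.9369, ~116.0508
    # Jiujiang Lushan Airport (JIU): 29°43′58″N, 115°58′58″E
    print(dms_to_decimal(29, 43, 58, "N"), dms_to_decimal(115, 58, 58, "E"))  # ~29.7328, ~115.9828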