
How far is Jiujiang from Bangkok?

The distance between Bangkok (Suvarnabhumi Airport) and Jiujiang (Jiujiang Lushan Airport) is 1473 miles / 2370 kilometers / 1280 nautical miles.

The driving distance from Bangkok (BKK) to Jiujiang (JIU) is 1900 miles / 3058 kilometers, and travel time by car is about 36 hours 15 minutes.
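Those driving figures imply an average speed of roughly 84 km/h (about 52 mph). A quick sanity check of that arithmetic, using the distance and time quoted above:

```python
km, hours = 3058, 36.25  # driving distance and travel time quoted above
print(f"{km / hours:.0f} km/h  /  {km / 1.609344 / hours:.0f} mph")  # -> 84 km/h / 52 mph
```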

Suvarnabhumi Airport – Jiujiang Lushan Airport: 1473 miles / 2370 kilometers / 1280 nautical miles


Distance from Bangkok to Jiujiang

There are several ways to calculate the distance from Bangkok to Jiujiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1472.882 miles
  • 2370.374 kilometers
  • 1279.899 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
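As a rough cross-check of the ellipsoidal figure, the sketch below uses the geopy library's geodesic distance, with coordinates converted to decimal degrees from the DMS values listed under Airport information. This is an illustration rather than the calculator's own code: recent geopy releases compute geodesics with Karney's algorithm on the WGS-84 ellipsoid instead of Vincenty's iteration, though the two agree very closely for a route like this.

```python
from geopy.distance import geodesic  # pip install geopy

# Decimal-degree coordinates, converted from the DMS values in "Airport information"
BKK = (13.6808, 100.7469)  # Suvarnabhumi Airport
JIU = (29.7328, 115.9828)  # Jiujiang Lushan Airport

d = geodesic(BKK, JIU)  # ellipsoidal (WGS-84) distance
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.km / 1.852:.1f} nmi")
```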

Haversine formula
  • 1475.332 miles
  • 2374.318 kilometers
  • 1282.029 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
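A minimal, self-contained haversine sketch (assuming a mean Earth radius of 6371 km, a common choice) reproduces figures close to those above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

BKK = (13.6808, 100.7469)  # Suvarnabhumi Airport, decimal degrees
JIU = (29.7328, 115.9828)  # Jiujiang Lushan Airport, decimal degrees

km = haversine_km(*BKK, *JIU)
print(f"{km / 1.609344:.0f} mi / {km:.0f} km / {km / 1.852:.0f} nmi")  # ~1475 mi / ~2374 km / ~1282 nmi
```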

How long does it take to fly from Bangkok to Jiujiang?

The estimated flight time from Suvarnabhumi Airport to Jiujiang Lushan Airport is 3 hours and 17 minutes.
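A common back-of-the-envelope estimate divides the distance by an assumed average speed and adds a fixed allowance for taxi, climb and descent. The parameters below (500 mph and 30 minutes) are illustrative assumptions rather than the calculator's actual model, so the result differs from the 3 hours 17 minutes quoted above by several minutes:

```python
DISTANCE_MI = 1473   # great-circle distance from above
CRUISE_MPH = 500     # assumed average speed (illustrative)
OVERHEAD_MIN = 30    # assumed allowance for taxi, climb and descent

total_min = DISTANCE_MI / CRUISE_MPH * 60 + OVERHEAD_MIN
print(f"~{int(total_min // 60)} h {int(total_min % 60)} min")  # -> ~3 h 26 min
```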

Flight carbon footprint between Suvarnabhumi Airport (BKK) and Jiujiang Lushan Airport (JIU)

On average, flying from Bangkok to Jiujiang generates about 178 kg of CO2 per passenger, which is equivalent to 392 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
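The pound figure is a direct unit conversion (1 kg ≈ 2.20462 lb):

```python
co2_kg = 178  # estimated per-passenger CO2 for this route
print(f"{co2_kg * 2.20462:.0f} lbs")  # -> 392 lbs
```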

Map of flight path and driving directions from Bangkok to Jiujiang

See the map of the shortest flight path between Suvarnabhumi Airport (BKK) and Jiujiang Lushan Airport (JIU).

Airport information

Origin: Suvarnabhumi Airport
City: Bangkok
Country: Thailand
IATA Code: BKK
ICAO Code: VTBS
Coordinates: 13°40′51″N, 100°44′49″E
Destination: Jiujiang Lushan Airport
City: Jiujiang
Country: China
IATA Code: JIU
ICAO Code: ZSJJ
Coordinates: 29°43′58″N, 115°58′58″E