How far is Ji'an from Xieng Khouang?

The distance between Xieng Khouang (Xieng Khouang Airport) and Ji'an (Jinggangshan Airport) is 895 miles / 1441 kilometers / 778 nautical miles.

The driving distance from Xieng Khouang (XKH) to Ji'an (JGS) is 1224 miles / 1970 kilometers, and travel time by car is about 23 hours 21 minutes.

Xieng Khouang Airport – Jinggangshan Airport

895 miles / 1441 kilometers / 778 nautical miles

Distance from Xieng Khouang to Ji'an

There are several ways to calculate the distance from Xieng Khouang to Ji'an. Here are two standard methods:

Vincenty's formula (applied above)
  • 895.107 miles
  • 1440.535 kilometers
  • 777.826 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
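As an illustration (not the calculator's own code), the standard inverse Vincenty iteration on the WGS-84 ellipsoid can be sketched in Python, using the airport coordinates listed below converted to decimal degrees:

```python
from math import atan, atan2, radians, sin, cos, tan, sqrt

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance in km on the WGS-84 ellipsoid.
    Raises ValueError if the iteration fails to converge."""
    a, f = 6378137.0, 1 / 298.257223563          # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a                              # semi-minor axis
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))      # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):
        sinLam, cosLam = sin(lam), cos(lam)
        sinSigma = sqrt((cosU2 * sinLam) ** 2 +
                        (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0                           # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break
    else:
        raise ValueError("Vincenty iteration did not converge")
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1000.0

# XKH (19°27′0″N, 103°9′28″E) and JGS (26°51′24″N, 114°44′13″E) in decimal degrees
print(round(vincenty_km(19.4500, 103.1578, 26.8567, 114.7369), 1))  # close to 1440.5 km
```

The result agrees with the ~1440.5 km figure above to within the rounding of the input coordinates.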

Haversine formula
  • 895.294 miles
  • 1440.836 kilometers
  • 777.989 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
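The haversine calculation is compact enough to sketch directly in Python; R = 6371 km is an assumed mean Earth radius, so the last digit may differ slightly from the figure above:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km between two lat/lon points,
    assuming a spherical Earth of mean radius r (km)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Airport coordinates converted to decimal degrees
xkh = (19.4500, 103.1578)   # Xieng Khouang Airport
jgs = (26.8567, 114.7369)   # Jinggangshan Airport
print(round(haversine_km(*xkh, *jgs), 1))  # about 1441 km
```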

How long does it take to fly from Xieng Khouang to Ji'an?

The estimated flight time from Xieng Khouang Airport to Jinggangshan Airport is 2 hours and 11 minutes.
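A common rule of thumb for such estimates is a fixed overhead for taxi, climb, and descent plus cruise at an average ground speed; the parameters below (30 minutes and 500 mph) are assumptions for illustration, so the result is close to, but not exactly, the 2 h 11 min figure above:

```python
def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airliner flight-time estimate: assumed fixed overhead
    plus cruise at an assumed average ground speed."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m:02d} min"

print(flight_time(895))  # 2 h 17 min
```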

Flight carbon footprint between Xieng Khouang Airport (XKH) and Jinggangshan Airport (JGS)

On average, flying from Xieng Khouang to Ji'an generates about 143 kg (roughly 316 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Xieng Khouang to Ji'an

See the map of the shortest flight path between Xieng Khouang Airport (XKH) and Jinggangshan Airport (JGS).

Airport information

Origin: Xieng Khouang Airport
City: Xieng Khouang
Country: Laos
IATA Code: XKH
ICAO Code: VLXK
Coordinates: 19°27′0″N, 103°9′28″E
Destination: Jinggangshan Airport
City: Ji'an
Country: China
IATA Code: JGS
ICAO Code: ZSJA
Coordinates: 26°51′24″N, 114°44′13″E
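The coordinates above are given in degrees/minutes/seconds; converting them to the decimal degrees used by distance formulas can be sketched as:

```python
def dms_to_decimal(deg, minutes, seconds, hemi):
    """Convert degrees/minutes/seconds plus hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    dd = deg + minutes / 60 + seconds / 3600
    return -dd if hemi in "SW" else dd

# XKH: 19°27′0″N, 103°9′28″E
print(dms_to_decimal(19, 27, 0, "N"), dms_to_decimal(103, 9, 28, "E"))
# JGS: 26°51′24″N, 114°44′13″E
print(dms_to_decimal(26, 51, 24, "N"), dms_to_decimal(114, 44, 13, "E"))
```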