How far is Ji'an from Luang Namtha?

The distance between Luang Namtha (Louang Namtha Airport) and Ji'an (Jinggangshan Airport) is 935 miles / 1505 kilometers / 813 nautical miles.

The driving distance from Luang Namtha (LXG) to Ji'an (JGS) is 1388 miles / 2234 kilometers, and travel time by car is about 25 hours 32 minutes.

Louang Namtha Airport – Jinggangshan Airport

935 Miles / 1505 Kilometers / 813 Nautical miles

Distance from Luang Namtha to Ji'an

There are several ways to calculate the distance from Luang Namtha to Ji'an. Here are two standard methods:

Vincenty's formula (applied above)
  • 935.309 miles
  • 1505.234 kilometers
  • 812.761 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
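As a cross-check, the ellipsoidal figure can be reproduced with a general-purpose geodesy library. The sketch below is a minimal example assuming geopy is installed; its geodesic class uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's original iteration, so the result should agree with the numbers above to within a few meters. The coordinates are those listed in the Airport information section.

```python
# Ellipsoidal distance between LXG and JGS (minimal sketch, assumes `pip install geopy`).
from geopy.distance import geodesic

# Airport coordinates converted from the degrees/minutes/seconds in "Airport information".
lxg = (20.96694, 101.40000)   # Louang Namtha Airport: 20°58′1″N, 101°24′0″E
jgs = (26.85667, 114.73694)   # Jinggangshan Airport: 26°51′24″N, 114°44′13″E

dist = geodesic(lxg, jgs)     # WGS-84 ellipsoid by default
print(f"{dist.miles:.3f} miles / {dist.km:.3f} km / {dist.nautical:.3f} NM")
```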

Haversine formula
  • 934.738 miles
  • 1504.315 kilometers
  • 812.265 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
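The great-circle figure can be computed directly from the same coordinates. The sketch below is a plain-Python haversine implementation that assumes a mean Earth radius of 6,371 km, so its output may differ from the values above by a fraction of a mile depending on the radius chosen.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Louang Namtha Airport to Jinggangshan Airport
km = haversine_km(20.96694, 101.40000, 26.85667, 114.73694)
print(f"{km * 0.621371:.3f} miles / {km:.3f} km / {km * 0.539957:.3f} NM")
```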

How long does it take to fly from Luang Namtha to Ji'an?

The estimated flight time from Louang Namtha Airport to Jinggangshan Airport is 2 hours and 16 minutes.

Flight carbon footprint between Louang Namtha Airport (LXG) and Jinggangshan Airport (JGS)

On average, flying from Luang Namtha to Ji'an generates about 146 kg of CO2 per passenger, which is equivalent to roughly 322 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
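The kilogram-to-pound conversion in that estimate can be verified with a one-liner (1 kg ≈ 2.20462 lb):

```python
print(round(146 * 2.20462))  # kg to lb: prints 322
```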

Map of flight path and driving directions from Luang Namtha to Ji'an

See the map of the shortest flight path between Louang Namtha Airport (LXG) and Jinggangshan Airport (JGS).

Airport information

Origin: Louang Namtha Airport
City: Luang Namtha
Country: Laos
IATA Code: LXG
ICAO Code: VLLN
Coordinates: 20°58′1″N, 101°24′0″E

Destination: Jinggangshan Airport
City: Ji'an
Country: China
IATA Code: JGS
ICAO Code: ZSJA
Coordinates: 26°51′24″N, 114°44′13″E