How far is Jining from Luang Namtha?

The distance between Luang Namtha (Louang Namtha Airport) and Jining (Jining Qufu Airport) is 1340 miles / 2157 kilometers / 1165 nautical miles.

The driving distance from Luang Namtha (LXG) to Jining (JNG) is 1796 miles / 2890 kilometers, and travel time by car is about 32 hours 44 minutes.

Louang Namtha Airport – Jining Qufu Airport

1340 miles / 2157 kilometers / 1165 nautical miles

Distance from Luang Namtha to Jining

There are several ways to calculate the distance from Luang Namtha to Jining. Here are two standard methods:

Vincenty's formula (applied above)
  • 1340.378 miles
  • 2157.129 kilometers
  • 1164.756 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
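Vincenty's inverse method iterates on the WGS-84 ellipsoid until the geodesic converges. As an illustration, here is a minimal Python sketch of that iteration; the function name, convergence tolerance, and iteration cap are our own choices, and the decimal coordinates are converted from the airport information below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate lambda until it converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0         # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)   # guard for equatorial lines
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# LXG and JNG in decimal degrees (converted from the coordinates below)
metres = vincenty_distance(20.966944, 101.4, 35.292778, 116.346667)
print(metres / 1609.344)   # statute miles, near the 1340.378 quoted above
print(metres / 1000.0)     # kilometres
print(metres / 1852.0)     # nautical miles
```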

Haversine formula
  • 1341.664 miles
  • 2159.198 kilometers
  • 1165.874 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
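The spherical version fits in a few lines. A minimal sketch, assuming the conventional mean Earth radius of 6371 km (the page does not state which radius it uses):

```python
import math

EARTH_RADIUS_KM = 6371.0   # assumed mean Earth radius for the spherical model

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(h))

print(haversine_km(20.966944, 101.4, 35.292778, 116.346667))  # ~2159 km
```

With a 6371 km radius this reproduces the roughly 2159 km figure above; the small gap versus Vincenty's result reflects the spherical versus ellipsoidal Earth models.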

How long does it take to fly from Luang Namtha to Jining?

The estimated flight time from Louang Namtha Airport to Jining Qufu Airport is 3 hours and 2 minutes.
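The page does not say how the estimate is derived. A common rule of thumb is time at cruise speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses illustrative values (500 mph cruise, 30-minute allowance), which lands close to, though not exactly on, the figure above.

```python
def flight_time_minutes(distance_miles, cruise_mph=500.0, buffer_min=30.0):
    """Crude flight-time estimate; both parameters are assumptions."""
    return distance_miles / cruise_mph * 60.0 + buffer_min

mins = flight_time_minutes(1340.378)
print(f"{int(mins // 60)} h {round(mins % 60)} min")   # about 3 h 11 min
```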

Flight carbon footprint between Louang Namtha Airport (LXG) and Jining Qufu Airport (JNG)

On average, flying from Luang Namtha to Jining generates about 169 kg of CO2 per passenger, which is roughly 373 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
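As a sanity check on the unit conversion, and to express the estimate per mile flown (route-specific arithmetic from the figures above, not a published emission factor):

```python
KG_PER_LB = 0.45359237    # exact definition of the pound in kilograms

co2_kg = 169.0            # per-passenger estimate quoted above
print(round(co2_kg / KG_PER_LB))   # -> 373 lbs
print(co2_kg / 1340.378)           # ~0.126 kg CO2 per passenger-mile
```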

Map of flight path and driving directions from Luang Namtha to Jining

See the map of the shortest flight path between Louang Namtha Airport (LXG) and Jining Qufu Airport (JNG).

Airport information

Origin: Louang Namtha Airport
City: Luang Namtha
Country: Laos
IATA Code: LXG
ICAO Code: VLLN
Coordinates: 20°58′1″N, 101°24′0″E

Destination: Jining Qufu Airport
City: Jining
Country: China
IATA Code: JNG
ICAO Code: ZSJG
Coordinates: 35°17′34″N, 116°20′48″E
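The distance functions above take decimal degrees, while these coordinates are given in degrees, minutes, and seconds. A small converter (the helper name is our own):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60.0 + seconds / 3600.0)

# Coordinates from the airport information above
lxg = (dms_to_decimal(20, 58, 1, "N"), dms_to_decimal(101, 24, 0, "E"))
jng = (dms_to_decimal(35, 17, 34, "N"), dms_to_decimal(116, 20, 48, "E"))
print(lxg)   # (20.9669..., 101.4)
print(jng)   # (35.2928..., 116.3467...)
```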