
How far is Beijing from Luang Namtha?

The distance between Luang Namtha (Louang Namtha Airport) and Beijing (Beijing Daxing International Airport) is 1556 miles / 2505 kilometers / 1352 nautical miles.

The driving distance from Luang Namtha (LXG) to Beijing (PKX) is 2024 miles / 3258 kilometers, and travel time by car is about 36 hours 56 minutes.

Louang Namtha Airport – Beijing Daxing International Airport

  • 1556 miles
  • 2505 kilometers
  • 1352 nautical miles


Distance from Luang Namtha to Beijing

There are several ways to calculate the distance from Luang Namtha to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 1556.287 miles
  • 2504.601 kilometers
  • 1352.376 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
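A minimal sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The helper name `vincenty_km` and the decimal coordinates (converted from the airport information below) are illustrative; the site does not publish its exact implementation.

```python
# Sketch of Vincenty's inverse formula on the WGS-84 ellipsoid.
# Iterates on the difference in longitude on the auxiliary sphere
# until it converges, then corrects for the ellipsoid's flattening.
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = sin(lam), cos(lam)
        sinSigma = sqrt((cosU2 * sinLam) ** 2 +
                        (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1000.0  # meters -> km

# LXG (20.96694 N, 101.4 E) to PKX (39.50917 N, 116.41056 E)
print(f"{vincenty_km(20.96694, 101.4, 39.50917, 116.41056):.1f} km")
```

With these inputs the result is close to the 2504.601 km quoted above.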

Haversine formula
  • 1558.449 miles
  • 2508.081 kilometers
  • 1354.255 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
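The haversine calculation above can be sketched in a few lines. The helper name `haversine_km` is illustrative, and the coordinates are the airport positions listed below, converted to decimal degrees.

```python
# Haversine great-circle distance, assuming a spherical Earth
# with mean radius 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two points."""
    R = 6371.0  # mean Earth radius (km)
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * R * asin(sqrt(a))

# LXG (20°58′1″N, 101°24′0″E) to PKX (39°30′33″N, 116°24′38″E)
d = haversine_km(20.96694, 101.4, 39.50917, 116.41056)
print(f"{d:.1f} km")  # ≈ 2508 km, matching the figure above
```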

How long does it take to fly from Luang Namtha to Beijing?

The estimated flight time from Louang Namtha Airport to Beijing Daxing International Airport is 3 hours and 26 minutes.
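The page does not state how the flight time is derived. A common rule of thumb adds a fixed taxi/climb/descent overhead to cruise time at a typical jet speed; the 30-minute overhead and 500 mph cruise speed below are assumptions for illustration, not the site's actual parameters.

```python
# Rough flight-time estimate: fixed ground/climb overhead plus
# cruise time at an assumed average speed. Both constants are
# illustrative assumptions.
DISTANCE_MI = 1556   # great-circle distance from above
CRUISE_MPH = 500     # assumed average cruise speed
OVERHEAD_H = 0.5     # assumed taxi/climb/descent overhead

hours = OVERHEAD_H + DISTANCE_MI / CRUISE_MPH
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"about {h} h {m} min")  # ≈ 3 h 37 min, near the 3 h 26 min above
```

Different overhead and speed assumptions move the estimate by a few minutes, which likely accounts for the gap from the quoted 3 hours 26 minutes.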

Flight carbon footprint between Louang Namtha Airport (LXG) and Beijing Daxing International Airport (PKX)

On average, flying from Luang Namtha to Beijing generates about 183 kg of CO2 per passenger, equivalent to roughly 404 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
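The kg-to-lb conversion above can be checked directly. Note that exactly 183 kg converts to about 403 lb; the 404 lb shown most likely comes from an unrounded per-passenger estimate before it was rounded to 183 kg.

```python
# Check the kg -> lb conversion of the CO2 estimate.
KG_TO_LB = 2.20462262  # pounds per kilogram
lbs = 183 * KG_TO_LB
print(round(lbs))  # 403 (the page's 404 likely uses an unrounded kg value)
```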

Map of flight path and driving directions from Luang Namtha to Beijing

See the map of the shortest flight path between Louang Namtha Airport (LXG) and Beijing Daxing International Airport (PKX).

Airport information

Origin Louang Namtha Airport
City: Luang Namtha
Country: Laos
IATA Code: LXG
ICAO Code: VLLN
Coordinates: 20°58′1″N, 101°24′0″E
Destination Beijing Daxing International Airport
City: Beijing
Country: China
IATA Code: PKX
ICAO Code: ZBAD
Coordinates: 39°30′33″N, 116°24′38″E
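The degrees-minutes-seconds coordinates listed above must be converted to decimal degrees before they can be fed into either distance formula. The helper name `dms_to_decimal` is a hypothetical convenience function for illustration.

```python
# Convert degrees/minutes/seconds to signed decimal degrees.
# Southern and western hemispheres get a negative sign.
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(20, 58, 1, "N"))   # ≈ 20.9669 (LXG latitude)
print(dms_to_decimal(39, 30, 33, "N"))  # ≈ 39.5092 (PKX latitude)
```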