How far is Burqin from Luang Namtha?

The distance between Luang Namtha (Louang Namtha Airport) and Burqin (Burqin Kanas Airport) is 2042 miles / 3287 kilometers / 1775 nautical miles.

The driving distance from Luang Namtha (LXG) to Burqin (KJI) is 3002 miles / 4831 kilometers, and travel time by car is about 55 hours 20 minutes.

Louang Namtha Airport – Burqin Kanas Airport

2042 miles / 3287 kilometers / 1775 nautical miles

Distance from Luang Namtha to Burqin

There are several ways to calculate the distance from Luang Namtha to Burqin. Here are two standard methods:

Vincenty's formula (applied above)
  • 2042.497 miles
  • 3287.080 kilometers
  • 1774.881 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
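As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed in the airport information section (converted to decimal degrees). The convergence tolerance and iteration cap are assumptions, not the calculator's exact settings, but the result lands near the 3287 km quoted above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
            C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Louang Namtha (LXG) and Burqin Kanas (KJI), from the coordinates listed below
meters = vincenty_distance(20.9669, 101.4000, 48.2222, 86.9958)
print(meters / 1000, "km")    # roughly 3287 km
print(meters / 1852, "nmi")   # roughly 1775 nautical miles
```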

Haversine formula
  • 2045.778 miles
  • 3292.361 kilometers
  • 1777.733 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
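For comparison, a minimal Python sketch of the haversine formula on a sphere; the 6371 km mean Earth radius is an assumption, which is why the spherical result differs slightly from the ellipsoidal one above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_distance(20.9669, 101.4000, 48.2222, 86.9958)
print(km, "km")                # roughly 3292 km
print(km * 0.621371, "miles")  # roughly 2046 miles
```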

How long does it take to fly from Luang Namtha to Burqin?

The estimated flight time from Louang Namtha Airport to Burqin Kanas Airport is 4 hours and 22 minutes.
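Flight-time estimates of this kind are typically a fixed taxi/climb allowance plus cruise time at a constant speed. The sketch below uses an assumed 500 mph cruise and 30-minute overhead (not the calculator's published parameters), so it comes out close to, but not exactly, the 4 hours 22 minutes quoted above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: fixed overhead plus cruise at a constant speed.
    Cruise speed and overhead are assumptions, not the calculator's exact inputs."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(2042.5))  # about 4 h 35 min under these assumptions
```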

Flight carbon footprint between Louang Namtha Airport (LXG) and Burqin Kanas Airport (KJI)

On average, flying from Luang Namtha to Burqin generates about 222 kg of CO2 per passenger, which is equivalent to about 490 pounds (lbs). This figure is an estimate and includes only the CO2 generated by burning jet fuel.
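A minimal sketch using only the figures quoted above and the standard kilogram-to-pound conversion; the per-kilometre intensity it prints is simply what the quoted figure implies, not an independent emissions model.

```python
co2_kg = 222              # quoted per-passenger estimate for this route
distance_km = 3287.080    # Vincenty distance from above

print(co2_kg * 2.20462)   # ~489 lbs, matching the ~490 lbs quoted above
print(co2_kg / distance_km)  # ~0.068 kg CO2 per passenger-km implied by the figure
```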

Map of flight path and driving directions from Luang Namtha to Burqin

See the map of the shortest flight path between Louang Namtha Airport (LXG) and Burqin Kanas Airport (KJI).

Airport information

Origin: Louang Namtha Airport
City: Luang Namtha
Country: Laos
IATA Code: LXG
ICAO Code: VLLN
Coordinates: 20°58′1″N, 101°24′0″E
Destination: Burqin Kanas Airport
City: Burqin
Country: China
IATA Code: KJI
ICAO Code: ZWKN
Coordinates: 48°13′20″N, 86°59′45″E