
How far is Yulin from Luang Namtha?

The distance between Luang Namtha (Louang Namtha Airport) and Yulin (Yulin Yuyang Airport) is 1291 miles / 2078 kilometers / 1122 nautical miles.

The driving distance from Luang Namtha (LXG) to Yulin (UYN) is 1721 miles / 2769 kilometers, and travel time by car is about 31 hours 25 minutes.

Distance from Luang Namtha to Yulin

There are several ways to calculate the distance from Luang Namtha to Yulin. Here are two standard methods:

Vincenty's formula (applied above)
  • 1291.428 miles
  • 2078.352 kilometers
  • 1122.221 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
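The page doesn't show the computation itself; below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid, using the airport coordinates from the Airport information section converted to decimal degrees. It should reproduce the figure above to within rounding.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Vincenty inverse solution on the WGS-84 ellipsoid; returns metres."""
        a = 6378137.0              # semi-major axis (m)
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # LXG (20°58′1″N, 101°24′0″E) to UYN (38°16′9″N, 109°43′51″E) in decimal degrees
    metres = vincenty_distance(20.96694, 101.40000, 38.26917, 109.73083)
    print(metres / 1000)   # ≈ 2078 km, matching the Vincenty figure above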

Haversine formula
  • 1294.479 miles
  • 2083.262 kilometers
  • 1124.872 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the surface).
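A compact haversine sketch in the same spirit; the mean Earth radius of 6,371 km is a common convention that the page doesn't state explicitly, so the last decimal places may differ:

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # Great-circle distance on a sphere of the given mean radius.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    print(haversine_km(20.96694, 101.40000, 38.26917, 109.73083))  # ≈ 2083 km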

How long does it take to fly from Luang Namtha to Yulin?

The estimated flight time from Louang Namtha Airport to Yulin Yuyang Airport is 2 hours and 56 minutes.
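The page doesn't publish the model behind this estimate. A common back-of-the-envelope approach, sketched below with assumed values for cruise speed and ground overhead, illustrates the shape of the calculation; different assumptions shift the result by several minutes, which is why this sketch doesn't land exactly on 2 hours 56 minutes.

    def flight_time_min(distance_miles, cruise_mph=500, overhead_min=30):
        # Rough estimate: cruise time plus a fixed allowance for taxi, climb and
        # descent. Both parameters are illustrative assumptions, not the site's model.
        return distance_miles / cruise_mph * 60 + overhead_min

    print(flight_time_min(1291.428))   # ≈ 185 min vs. the 176 min quoted above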

Flight carbon footprint between Louang Namtha Airport (LXG) and Yulin Yuyang Airport (UYN)

On average, flying from Luang Namtha to Yulin generates about 166 kg of CO2 per passenger, roughly 367 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
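The page doesn't state its emissions model either. Figures like this are often derived as an emission factor times the flight distance; the factor below is an assumption chosen for illustration, and the sketch also shows how converting to pounds before rounding yields the 367 lbs quoted above.

    KM = 2078.352          # Vincenty distance from above
    FACTOR = 0.08          # assumed kg CO2 per passenger-km, not the site's coefficient
    LB_PER_KG = 2.20462

    co2_kg = KM * FACTOR                 # ≈ 166.3 kg
    co2_lb = co2_kg * LB_PER_KG          # ≈ 366.6 lb
    print(round(co2_kg), round(co2_lb))  # 166 367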

Map of flight path and driving directions from Luang Namtha to Yulin

See the map of the shortest flight path between Louang Namtha Airport (LXG) and Yulin Yuyang Airport (UYN).

Airport information

Origin: Louang Namtha Airport
City: Luang Namtha
Country: Laos
IATA Code: LXG
ICAO Code: VLLN
Coordinates: 20°58′1″N, 101°24′0″E

Destination: Yulin Yuyang Airport
City: Yulin
Country: China
IATA Code: UYN
ICAO Code: ZLYL
Coordinates: 38°16′9″N, 109°43′51″E
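The sketches above use these coordinates in decimal degrees; the conversion helper below is hypothetical, not something the page provides:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        # Signed decimal degrees from degrees/minutes/seconds plus hemisphere letter.
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(20, 58, 1, "N"), dms_to_decimal(101, 24, 0, "E"))   # LXG ≈ 20.9669, 101.4000
    print(dms_to_decimal(38, 16, 9, "N"), dms_to_decimal(109, 43, 51, "E"))  # UYN ≈ 38.2692, 109.7308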