
How far is Yibin from Luang Namtha?

The distance between Luang Namtha (Louang Namtha Airport) and Yibin (Yibin Wuliangye Airport) is 577 miles / 929 kilometers / 502 nautical miles.

The driving distance from Luang Namtha (LXG) to Yibin (YBP) is 844 miles / 1358 kilometers, and travel time by car is about 15 hours 36 minutes.

Louang Namtha Airport – Yibin Wuliangye Airport

577 miles / 929 kilometers / 502 nautical miles

Distance from Luang Namtha to Yibin

There are several ways to calculate the distance from Luang Namtha to Yibin. Here are two standard methods:

Vincenty's formula (applied above)
  • 577.395 miles
  • 929.227 kilometers
  • 501.742 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
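For the curious, here is a minimal Python sketch of Vincenty's inverse formula. The WGS-84 ellipsoid constants and the iteration are the standard published ones; this page doesn't show its exact implementation, so treat this as an illustration rather than the code behind the figures above. Coordinates come from the airport information at the bottom of the page.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in meters between two points, via Vincenty's inverse
    formula on the WGS-84 ellipsoid."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate on the longitude difference
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)   # zero along the equator
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# LXG (20°58′1″N, 101°24′0″E) and YBP (28°51′28″N, 104°31′30″E)
m = vincenty_inverse(20 + 58/60 + 1/3600, 101.4,
                     28 + 51/60 + 28/3600, 104 + 31/60 + 30/3600)
print(f"{m / 1609.344:.3f} mi / {m / 1000:.3f} km")  # ≈ 577.4 mi / 929.2 km
```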

Haversine formula
  • 579.230 miles
  • 932.180 kilometers
  • 503.337 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
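The haversine calculation is compact enough to show in full. The sketch below assumes the commonly used mean earth radius of 6,371 km; with that radius and the airport coordinates from this page it reproduces the ~932.2 km figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# LXG to YBP, coordinates converted from the airport information below
km = haversine_km(20.9669, 101.4000, 28.8578, 104.5250)
print(f"{km:.3f} km / {km / 1.609344:.3f} mi")  # ≈ 932.2 km / 579.2 mi
```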

How long does it take to fly from Luang Namtha to Yibin?

The estimated flight time from Louang Namtha Airport to Yibin Wuliangye Airport is 1 hour and 35 minutes.
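The page doesn't state how the flight time is derived, but estimates like this are typically the distance divided by an assumed cruise speed plus a fixed allowance for taxi, climb, and descent. The sketch below uses a hypothetical 500 mph cruise and a 30-minute overhead, which lands close to the figure above.

```python
distance_mi = 577.395   # Vincenty distance from above
cruise_mph = 500        # assumed average cruise speed (hypothetical)
overhead_min = 30       # assumed taxi/climb/descent allowance (hypothetical)

total_min = overhead_min + distance_mi / cruise_mph * 60
print(f"~{int(total_min // 60)} h {round(total_min % 60)} min")  # ~1 h 39 min
```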

Flight carbon footprint between Louang Namtha Airport (LXG) and Yibin Wuliangye Airport (YBP)

On average, flying from Luang Namtha to Yibin generates about 110 kg of CO2 per passenger (110 kilograms is equal to 242 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
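The page doesn't publish its emissions model. Dividing its own numbers gives roughly 0.19 kg of CO2 per passenger-mile for this route; that back-derived factor is all the sketch below assumes, while the kilogram-to-pound conversion is exact.

```python
kg_co2 = 110                   # page's per-passenger estimate for this route
lbs = kg_co2 * 2.20462         # exact kg-to-lb conversion
per_mile = kg_co2 / 577.395    # ≈ 0.19 kg CO2 per passenger-mile (back-derived)
print(f"{kg_co2} kg ≈ {lbs:.1f} lbs, ≈ {per_mile:.2f} kg per mile")
```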

Map of flight path and driving directions from Luang Namtha to Yibin

See the map of the shortest flight path between Louang Namtha Airport (LXG) and Yibin Wuliangye Airport (YBP).

Airport information

Origin: Louang Namtha Airport
City: Luang Namtha
Country: Laos
IATA Code: LXG
ICAO Code: VLLN
Coordinates: 20°58′1″N, 101°24′0″E
Destination: Yibin Wuliangye Airport
City: Yibin
Country: China
IATA Code: YBP
ICAO Code: ZUYB
Coordinates: 28°51′28″N, 104°31′30″E
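
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas need decimal degrees. A small conversion helper, applied to the airport values listed above:

```python
def dms_to_decimal(deg, minutes, seconds, hemi):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemi in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(20, 58, 1, "N"), dms_to_decimal(101, 24, 0, "E"))    # LXG ≈ 20.9669, 101.4000
print(dms_to_decimal(28, 51, 28, "N"), dms_to_decimal(104, 31, 30, "E"))  # YBP ≈ 28.8578, 104.5250
```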