
How far is Baise from Luang Namtha?

The distance between Luang Namtha (Louang Namtha Airport) and Baise (Baise Bama Airport) is 403 miles / 649 kilometers / 350 nautical miles.

The driving distance from Luang Namtha (LXG) to Baise (AEB) is 756 miles / 1216 kilometers, and travel time by car is about 14 hours 9 minutes.

Louang Namtha Airport – Baise Bama Airport

  • 403 miles
  • 649 kilometers
  • 350 nautical miles


Distance from Luang Namtha to Baise

There are several ways to calculate the distance from Luang Namtha to Baise. Here are two standard methods:

Vincenty's formula (applied above)
  • 403.116 miles
  • 648.752 kilometers
  • 350.298 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
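The iteration behind that figure can be sketched as below. This is a standard implementation of Vincenty's inverse problem on the WGS-84 ellipsoid (the ellipsoid parameters are an assumption; the site does not say which datum it uses), and the function name `vincenty_inverse` is illustrative:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, iterations=200, tol=1e-12):
    """Geodesic distance in metres between two points, Vincenty inverse method."""
    # WGS-84 ellipsoid (assumed; the calculator's datum is not published)
    a = 6378137.0                 # semi-major axis, metres
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(iterations):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0            # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        if cos2Alpha == 0:
            cos2SigmaM = 0.0      # both points on the equator
        else:
            cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)   # metres

# LXG (20°58′1″N, 101°24′0″E) to AEB (23°43′14″N, 106°57′35″E)
print(vincenty_inverse(20.96694, 101.40000, 23.72056, 106.95972) / 1000)
```

Run on the airport coordinates listed at the bottom of the page, this reproduces the quoted 648.75 km to within rounding.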

Haversine formula
  • 402.984 miles
  • 648.540 kilometers
  • 350.183 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
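The haversine computation is short enough to show in full. A minimal sketch, assuming the conventional mean Earth radius of 6371 km (the site's exact radius is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# LXG (20°58′1″N, 101°24′0″E) to AEB (23°43′14″N, 106°57′35″E)
print(haversine_km(20.96694, 101.40000, 23.72056, 106.95972))  # ≈ 648.5 km
```

The result matches the 648.540 km quoted above to within a few tens of metres, the residual coming from rounding the coordinates.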

How long does it take to fly from Luang Namtha to Baise?

The estimated flight time from Louang Namtha Airport to Baise Bama Airport is 1 hour and 15 minutes.
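The calculator does not publish its flight-time formula, but a common rule of thumb (a fixed allowance for taxi, climb and descent plus cruise at an assumed average speed; both constants below are assumptions, not the site's method) lands close to the quoted figure:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Rule-of-thumb estimate: fixed taxi/climb/descent overhead plus
    # time spent at an assumed average cruise speed. Both parameters
    # are guesses, not the calculator's documented model.
    return overhead_min + distance_miles / cruise_mph * 60

print(round(estimated_flight_minutes(403)))  # ≈ 78 minutes
```

For a 403-mile leg this gives roughly 78 minutes, in the same range as the quoted 1 hour 15 minutes.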

Flight carbon footprint between Louang Namtha Airport (LXG) and Baise Bama Airport (AEB)

On average, flying from Luang Namtha to Baise generates about 84 kg (roughly 185 pounds) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
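The site does not state its emissions model, but back-calculating from its own numbers gives an implied factor of roughly 0.13 kg of CO2 per passenger-kilometre, and the pound conversion is a straightforward divide. Both derived values below are inferred from the page, not documented by the calculator:

```python
KG_PER_LB = 0.45359237            # exact definition of the avoirdupois pound

co2_kg = 84.0                     # site's per-passenger estimate for this route
distance_km = 649.0               # great-circle distance quoted on the page

implied_factor = co2_kg / distance_km   # ≈ 0.129 kg CO2 per passenger-km
co2_lb = co2_kg / KG_PER_LB             # ≈ 185 lb

print(round(implied_factor, 3), round(co2_lb, 1))
```

Note that 84 kg converts to about 185 lb, so a quoted pound figure slightly above that presumably comes from an unrounded kilogram value.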

Map of flight path and driving directions from Luang Namtha to Baise

See the map of the shortest flight path between Louang Namtha Airport (LXG) and Baise Bama Airport (AEB).

Airport information

Origin Louang Namtha Airport
City: Luang Namtha
Country: Laos
IATA Code: LXG
ICAO Code: VLLN
Coordinates: 20°58′1″N, 101°24′0″E
Destination Baise Bama Airport
City: Baise
Country: China
IATA Code: AEB
ICAO Code: ZGBS
Coordinates: 23°43′14″N, 106°57′35″E