
How far is Beihai from Luang Namtha?

The distance between Luang Namtha (Louang Namtha Airport) and Beihai (Beihai Fucheng Airport) is 511 miles / 822 kilometers / 444 nautical miles.

The driving distance from Luang Namtha (LXG) to Beihai (BHY) is 774 miles / 1245 kilometers, and travel time by car is about 15 hours 37 minutes.

Louang Namtha Airport – Beihai Fucheng Airport

511 miles / 822 kilometers / 444 nautical miles


Distance from Luang Namtha to Beihai

There are several ways to calculate the distance from Luang Namtha to Beihai. Here are two standard methods:

Vincenty's formula (applied above)
  • 510.585 miles
  • 821.707 kilometers
  • 443.686 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
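For readers who want to reproduce the ellipsoidal figure, here is a minimal sketch in Python using the pyproj library (an assumption; it is not part of this page). pyproj solves the same inverse geodesic problem on the WGS-84 ellipsoid, using Karney's algorithm rather than Vincenty's iteration, so the result should agree with the Vincenty figure above to well under a metre.

```python
# Sketch of the ellipsoidal (WGS-84) distance calculation using pyproj.
# Assumes pyproj is installed: pip install pyproj
from pyproj import Geod

# Airport coordinates from the "Airport information" section below (lat, lon)
LXG = (20.96694, 101.40000)   # Louang Namtha Airport
BHY = (21.53917, 109.29389)   # Beihai Fucheng Airport

geod = Geod(ellps="WGS84")    # WGS-84 ellipsoid
# Geod.inv takes longitudes first and returns forward/back azimuths
# plus the geodesic distance in metres.
_, _, meters = geod.inv(LXG[1], LXG[0], BHY[1], BHY[0])

print(f"{meters / 1000:.3f} km")              # ≈ 821.7 km
print(f"{meters / 1609.344:.3f} miles")       # ≈ 510.6 miles
print(f"{meters / 1852:.3f} nautical miles")  # ≈ 443.7 NM
```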

Haversine formula
  • 509.808 miles
  • 820.457 kilometers
  • 443.011 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
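As a sketch, the haversine calculation takes only a few lines of standard-library Python; with the airport coordinates listed below and a mean Earth radius of 6,371 km it reproduces the spherical figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Louang Namtha Airport (LXG) to Beihai Fucheng Airport (BHY)
km = haversine_km(20.96694, 101.40000, 21.53917, 109.29389)
print(f"{km:.3f} km")                       # ≈ 820.5 km
print(f"{km / 1.609344:.3f} miles")         # ≈ 509.8 miles
print(f"{km / 1.852:.3f} nautical miles")   # ≈ 443.0 NM
```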

How long does it take to fly from Luang Namtha to Beihai?

The estimated flight time from Louang Namtha Airport to Beihai Fucheng Airport is 1 hour and 28 minutes.
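The page does not state how the flight time is derived. A common rule of thumb is to divide the distance by an assumed average speed and add a fixed allowance for take-off and landing; the 500 mph cruise speed and 30-minute allowance below are assumptions, so the result only roughly matches the 1 hour 28 minutes quoted above.

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough flight-time estimate: cruise time plus a fixed take-off/landing allowance.
    Both parameters are assumptions, not the calculator's published method."""
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = estimate_flight_minutes(510.585)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ≈ 1 h 31 min
```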

Flight carbon footprint between Louang Namtha Airport (LXG) and Beihai Fucheng Airport (BHY)

On average, flying from Luang Namtha to Beihai generates about 100 kg of CO2 per passenger; 100 kilograms equals 220 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
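The calculator does not publish its emissions methodology. As an illustrative sketch only, applying a flat per-passenger-kilometre emission factor to the flight distance lands near the 100 kg figure quoted above; the factor used below is an assumption, not the site's actual model.

```python
def co2_kg_per_passenger(distance_km, kg_co2_per_pkm=0.12):
    """Very rough CO2 estimate: distance times an assumed per-passenger-km factor.
    The 0.12 kg/pkm factor is illustrative, not the calculator's published value."""
    return distance_km * kg_co2_per_pkm

kg = co2_kg_per_passenger(821.707)
print(f"{kg:.0f} kg CO2 (~{kg * 2.20462:.0f} lbs)")  # ≈ 99 kg ≈ 217 lbs
```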

Map of flight path and driving directions from Luang Namtha to Beihai

See the map of the shortest flight path between Louang Namtha Airport (LXG) and Beihai Fucheng Airport (BHY).

Airport information

Origin: Louang Namtha Airport
City: Luang Namtha
Country: Laos
IATA Code: LXG
ICAO Code: VLLN
Coordinates: 20°58′1″N, 101°24′0″E
Destination: Beihai Fucheng Airport
City: Beihai
Country: China
IATA Code: BHY
ICAO Code: ZGBH
Coordinates: 21°32′21″N, 109°17′38″E