
How far is Zhuhai from Luang Namtha?

The distance between Luang Namtha (Louang Namtha Airport) and Zhuhai (Zhuhai Jinwan Airport) is 774 miles / 1246 kilometers / 673 nautical miles.

The driving distance from Luang Namtha (LXG) to Zhuhai (ZUH) is 1071 miles / 1724 kilometers, and travel time by car is about 20 hours 53 minutes.

Louang Namtha Airport – Zhuhai Jinwan Airport

774 miles / 1246 kilometers / 673 nautical miles


Distance from Luang Namtha to Zhuhai

There are several ways to calculate the distance from Luang Namtha to Zhuhai. Here are two standard methods:

Vincenty's formula (applied above)
  • 774.269 miles
  • 1246.065 kilometers
  • 672.821 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
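For illustration, here is a small Python sketch of an ellipsoidal distance calculation. It uses the geopy library's geodesic distance (Karney's method on the WGS-84 ellipsoid) rather than Vincenty's formula itself, which is an assumption on my part, but on this route it should agree with the figures above to within a few meters. The coordinates are the DMS values from the airport information below, converted to decimal degrees.

    # Ellipsoidal (WGS-84) distance between LXG and ZUH using geopy.
    # Note: geopy's geodesic uses Karney's algorithm, not Vincenty's formula,
    # but both treat the earth as an ellipsoid and give near-identical results here.
    from geopy.distance import geodesic

    # Decimal-degree coordinates derived from the DMS values listed below:
    # LXG 20°58′1″N, 101°24′0″E  /  ZUH 22°0′23″N, 113°22′33″E
    lxg = (20 + 58/60 + 1/3600, 101 + 24/60)
    zuh = (22 + 0/60 + 23/3600, 113 + 22/60 + 33/3600)

    d = geodesic(lxg, zuh)
    print(f"{d.miles:.3f} miles / {d.km:.3f} km / {d.nautical:.3f} NM")
    # Expect roughly 774 miles / 1246 km / 673 NM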

Haversine formula
  • 773.095 miles
  • 1244.175 kilometers
  • 671.801 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
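As a worked example, the following Python sketch implements the haversine formula directly, using the airport coordinates listed below converted from degrees/minutes/seconds to decimal degrees. With a mean earth radius of 6371 km (an assumed value; other radii are also in common use) it reproduces the roughly 1244 km figure above.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two points on a spherical earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    # LXG 20°58′1″N, 101°24′0″E and ZUH 22°0′23″N, 113°22′33″E in decimal degrees
    lxg_lat, lxg_lon = 20 + 58/60 + 1/3600, 101 + 24/60
    zuh_lat, zuh_lon = 22 + 0/60 + 23/3600, 113 + 22/60 + 33/3600

    km = haversine_km(lxg_lat, lxg_lon, zuh_lat, zuh_lon)
    print(f"{km:.3f} km / {km / 1.609344:.3f} miles / {km / 1.852:.3f} NM")
    # Expect roughly 1244 km / 773 miles / 672 NM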

How long does it take to fly from Luang Namtha to Zhuhai?

The estimated flight time from Louang Namtha Airport to Zhuhai Jinwan Airport is 1 hour and 57 minutes.
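The exact parameters behind this estimate are not published here; a common rule of thumb for such calculators is distance at an average cruise speed plus a fixed allowance for taxi, climb, and descent. The sketch below uses assumed values (500 mph cruise, 30-minute allowance), so it lands near, but not exactly on, the figure above.

    # Rough flight-time heuristic (assumed parameters, not necessarily this site's formula)
    distance_miles = 774
    cruise_mph = 500          # assumed average cruise speed
    overhead_min = 30         # assumed allowance for taxi, climb and descent

    total_min = distance_miles / cruise_mph * 60 + overhead_min
    print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # about 2 h 2 min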

Flight carbon footprint between Louang Namtha Airport (LXG) and Zhuhai Jinwan Airport (ZUH)

On average, flying from Luang Namtha to Zhuhai generates about 132 kg of CO2 per passenger, which is roughly 292 pounds (lbs). These figures are estimates and account only for the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Luang Namtha to Zhuhai

See the map of the shortest flight path between Louang Namtha Airport (LXG) and Zhuhai Jinwan Airport (ZUH).

Airport information

Origin: Louang Namtha Airport
City: Luang Namtha
Country: Laos
IATA Code: LXG
ICAO Code: VLLN
Coordinates: 20°58′1″N, 101°24′0″E
Destination: Zhuhai Jinwan Airport
City: Zhuhai
Country: China
IATA Code: ZUH
ICAO Code: ZGSD
Coordinates: 22°0′23″N, 113°22′33″E