
How far is Zhanjiang from Luang Namtha?

The distance between Luang Namtha (Louang Namtha Airport) and Zhanjiang (Zhanjiang Airport) is 579 miles / 931 kilometers / 503 nautical miles.

The driving distance from Luang Namtha (LXG) to Zhanjiang (ZHA) is 839 miles / 1351 kilometers, and travel time by car is about 16 hours 50 minutes.


Distance from Luang Namtha to Zhanjiang

There are several ways to calculate the distance from Luang Namtha to Zhanjiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 578.548 miles
  • 931.082 kilometers
  • 502.744 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
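
As a sketch of how this can be computed in practice, the snippet below uses the geopy library (an assumption; the page does not say which implementation it uses). geopy's geodesic() performs an ellipsoidal calculation on WGS-84 (Karney's method via geographiclib) that matches Vincenty's figures here to well under a meter. The coordinates come from the airport information section below.

```python
# Ellipsoidal (Vincenty-style) distance using geopy -- an assumed library
# choice for illustration; the page does not name its implementation.
from geopy.distance import geodesic

lxg = (20.96694, 101.40000)   # Louang Namtha Airport (20°58′1″N, 101°24′0″E)
zha = (21.21417, 110.35778)   # Zhanjiang Airport (21°12′51″N, 110°21′28″E)

d = geodesic(lxg, zha)
print(f"{d.miles:.3f} mi / {d.kilometers:.3f} km / {d.nautical:.3f} nmi")
# ≈ 578.5 mi / 931.1 km / 502.7 nmi, matching the figures above
```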

Haversine formula
  • 577.653 miles
  • 929.643 kilometers
  • 501.967 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
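
The haversine formula is simple enough to implement directly. The sketch below assumes a mean earth radius of 6371 km, which reproduces the figures above to within rounding:

```python
# Self-contained haversine sketch (spherical earth, mean radius 6371 km).
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometers between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

km = haversine_km(20.96694, 101.40000, 21.21417, 110.35778)
print(f"{km:.3f} km = {km / 1.609344:.3f} mi = {km / 1.852:.3f} nmi")
# ≈ 929.6 km / 577.6 mi / 501.9 nmi, matching the figures above
```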

How long does it take to fly from Luang Namtha to Zhanjiang?

The estimated flight time from Louang Namtha Airport to Zhanjiang Airport is 1 hour and 35 minutes.
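
The page does not state how this estimate is derived. A common rule of thumb, assumed here purely for illustration, is an average speed of about 500 mph plus roughly 30 minutes for taxi, climb, and descent, which lands within a few minutes of the figure above:

```python
# Hedged back-of-the-envelope flight-time estimate; the ~500 mph average
# speed and ~30 min overhead are assumptions, not the page's stated method.
distance_mi = 578.548
minutes = 30 + distance_mi / 500 * 60
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # -> 1 h 39 min
```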

Flight carbon footprint between Louang Namtha Airport (LXG) and Zhanjiang Airport (ZHA)

On average, flying from Luang Namtha to Zhanjiang generates about 110 kg (242 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
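
For reference, the sketch below shows the unit conversion and the per-mile emission factor implied by these figures; the 110 kg value itself comes from this page, not from the calculation:

```python
# Unit conversion and implied emission factor (110 kg is the page's figure).
co2_kg = 110
print(f"{co2_kg * 2.20462:.1f} lb")  # -> 242.5 lb (the page rounds to 242)
print(f"{co2_kg / 578.548:.3f} kg CO2 per passenger-mile")  # -> 0.190
```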

Map of flight path and driving directions from Luang Namtha to Zhanjiang

[Map: shortest flight path between Louang Namtha Airport (LXG) and Zhanjiang Airport (ZHA)]

Airport information

Origin: Louang Namtha Airport
City: Luang Namtha
Country: Laos
IATA Code: LXG
ICAO Code: VLLN
Coordinates: 20°58′1″N, 101°24′0″E
Destination: Zhanjiang Airport
City: Zhanjiang
Country: China
IATA Code: ZHA
ICAO Code: ZGZJ
Coordinates: 21°12′51″N, 110°21′28″E