
How far is Luang Namtha from Nyaung U?

The distance between Nyaung U (Nyaung U Airport) and Luang Namtha (Louang Namtha Airport) is 418 miles / 673 kilometers / 363 nautical miles.

The driving distance from Nyaung U (NYU) to Luang Namtha (LXG) is 678 miles / 1091 kilometers, and travel time by car is about 14 hours 43 minutes.

Nyaung U Airport – Louang Namtha Airport

418 miles / 673 kilometers / 363 nautical miles
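The three figures above are the same distance expressed in different units. The conversion factors below are exact by international definition, so the figures can be cross-checked directly:

```python
# Exact unit-conversion factors (by international definition).
KM_PER_MILE = 1.609344        # 1 statute mile = 1.609344 km
KM_PER_NAUTICAL_MILE = 1.852  # 1 nautical mile = 1.852 km

def miles_to_km(miles: float) -> float:
    """Convert statute miles to kilometers."""
    return miles * KM_PER_MILE

def km_to_nautical_miles(km: float) -> float:
    """Convert kilometers to nautical miles."""
    return km / KM_PER_NAUTICAL_MILE

distance_km = miles_to_km(417.998)
distance_nm = km_to_nautical_miles(distance_km)
print(round(distance_km, 3), "km /", round(distance_nm, 3), "NM")
```

Running this on the 417.998-mile Vincenty figure reproduces the 672.703 km and 363.231 NM values quoted further down the page.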


Distance from Nyaung U to Luang Namtha

There are several ways to calculate the distance from Nyaung U to Luang Namtha. Here are two standard methods:

Vincenty's formula (applied above)
  • 417.998 miles
  • 672.703 kilometers
  • 363.231 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
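The page does not state which ellipsoid it uses; the sketch below implements the standard iterative Vincenty inverse solution on the commonly used WGS-84 ellipsoid (an assumption), applied to the airport coordinates listed at the bottom of this page:

```python
import math

# WGS-84 ellipsoid parameters (assumed; the page does not state its datum).
A = 6378137.0             # semi-major axis, metres
F = 1 / 298.257223563     # flattening
B = (1 - F) * A           # semi-minor axis, metres

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse solution: geodesic distance in metres on the ellipsoid."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere.
    U1 = math.atan((1 - F) * math.tan(phi1))
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # first approximation of the longitude difference on the sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = big_b * sin_sigma * (
        cos_2sigma_m + big_b / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - big_b / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return B * big_a * (sigma - delta_sigma)

# NYU (21°10′43″N, 94°55′48″E) to LXG (20°58′1″N, 101°24′0″E).
d = vincenty_distance(21.17861, 94.93000, 20.96694, 101.40000)
print(round(d / 1000, 3), "km")
```

On WGS-84 this yields approximately the 672.703 km quoted above; a different ellipsoid would shift the result slightly.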

Haversine formula
  • 417.353 miles
  • 671.665 kilometers
  • 362.670 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
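The haversine formula is compact enough to sketch in full. The mean Earth radius of 6371 km below is an assumed value (the page does not state which radius it uses), so the result may differ from the quoted figure by a fraction of a kilometer:

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius (assumed value)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# NYU to LXG using the coordinates listed at the bottom of this page.
print(round(haversine_km(21.17861, 94.93000, 20.96694, 101.40000), 3), "km")
```

The spherical result lands within about a kilometer of the ellipsoidal Vincenty figure for this route, which is typical for distances of a few hundred kilometers.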

How long does it take to fly from Nyaung U to Luang Namtha?

The estimated flight time from Nyaung U Airport to Louang Namtha Airport is 1 hour and 17 minutes.
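The page does not document how it derives flight time. A common heuristic is cruise distance divided by an average ground speed, plus a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute allowance below are assumptions, not the site's stated method, and give a figure close to (but not exactly) the 1 hour 17 minutes above:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.
    Both parameters are illustrative assumptions."""
    return distance_miles / cruise_mph * 60.0 + overhead_min

minutes = estimated_flight_minutes(418)
print(int(minutes // 60), "h", round(minutes % 60), "min")
```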

Flight carbon footprint between Nyaung U Airport (NYU) and Louang Namtha Airport (LXG)

On average, flying from Nyaung U to Luang Namtha generates about 87 kg (191 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
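Per-passenger CO2 estimates of this kind are typically distance multiplied by an emission factor. Back-solving from the figures above gives roughly 0.208 kg of CO2 per passenger-mile; this factor is inferred, not one the page documents:

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

def co2_kg(distance_miles, kg_per_passenger_mile=0.208):
    """Per-passenger CO2 estimate from a per-mile emission factor.
    The default factor is inferred from 87 kg / 418 miles, not a documented value."""
    return distance_miles * kg_per_passenger_mile

kg = co2_kg(418)
print(round(kg), "kg CO2, about", round(kg / KG_PER_LB), "lb")
```

Real-world estimators adjust the factor for aircraft type, seat class, and load factor, so this flat per-mile rate is only a first approximation.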

Map of flight path and driving directions from Nyaung U to Luang Namtha

See the map of the shortest flight path between Nyaung U Airport (NYU) and Louang Namtha Airport (LXG).

Airport information

Origin Nyaung U Airport
City: Nyaung U
Country: Burma
IATA Code: NYU
ICAO Code: VYBG
Coordinates: 21°10′43″N, 94°55′48″E
Destination Louang Namtha Airport
City: Luang Namtha
Country: Laos
IATA Code: LXG
ICAO Code: VLLN
Coordinates: 20°58′1″N, 101°24′0″E
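The coordinates above are given in degrees/minutes/seconds, while distance formulas take signed decimal degrees. The conversion is a short sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees; S and W are negative."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60.0 + seconds / 3600.0)

# Nyaung U Airport: 21°10′43″N, 94°55′48″E
nyu_lat = dms_to_decimal(21, 10, 43, "N")
nyu_lon = dms_to_decimal(94, 55, 48, "E")
print(round(nyu_lat, 5), round(nyu_lon, 5))  # prints 21.17861 94.93
```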