How far is Jinzhou from Luang Namtha?

The distance between Luang Namtha (Louang Namtha Airport) and Jinzhou (Jinzhou Bay Airport) is 1803 miles / 2902 kilometers / 1567 nautical miles.

The driving distance from Luang Namtha (LXG) to Jinzhou (JNZ) is 2318 miles / 3730 kilometers, and travel time by car is about 42 hours 4 minutes.

Louang Namtha Airport – Jinzhou Bay Airport

1803 Miles
2902 Kilometers
1567 Nautical miles

Distance from Luang Namtha to Jinzhou

There are several ways to calculate the distance from Luang Namtha to Jinzhou. Here are two standard methods:

Vincenty's formula (applied above)
  • 1803.235 miles
  • 2902.025 kilometers
  • 1566.968 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
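As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates from the table below. The constants and convergence tolerance are standard textbook values, not necessarily the exact parameters this site uses.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# LXG (20°58′1″N, 101°24′0″E) -> JNZ (41°6′5″N, 121°3′43″E)
metres = vincenty_distance(20.96694, 101.40000, 41.10139, 121.06194)
print(metres / 1000)      # ≈ 2902 km
print(metres / 1609.344)  # ≈ 1803 miles
```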

Haversine formula
  • 1804.841 miles
  • 2904.609 kilometers
  • 1568.364 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
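The haversine formula is much shorter than Vincenty's because it skips the iterative ellipsoidal correction. A minimal sketch, assuming the commonly used mean Earth radius of 6371.0088 km:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0088):
    """Great-circle distance (km) on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    h = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_distance(20.96694, 101.40000, 41.10139, 121.06194))  # ≈ 2904.6 km
```

The small gap between the two results (about 2.6 km here) reflects the difference between the spherical and ellipsoidal Earth models.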

How long does it take to fly from Luang Namtha to Jinzhou?

The estimated flight time from Louang Namtha Airport to Jinzhou Bay Airport is 3 hours and 54 minutes.
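Published flight-time estimates like this are typically derived from the great-circle distance and an assumed average block speed; the exact parameters this site uses aren't stated. A minimal sketch, assuming a hypothetical 500 mph cruise speed plus a fixed 30-minute allowance for taxi, takeoff, and climb:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Crude block-time estimate: cruise leg plus a fixed overhead (assumed values)."""
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = estimate_flight_minutes(1803)
print(f"{int(minutes // 60)} h {int(minutes % 60)} min")  # ~4 h 6 min with these assumptions
```

With these assumed parameters the sketch lands near, but not exactly on, the 3 hours 54 minutes quoted above.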

Flight carbon footprint between Louang Namtha Airport (LXG) and Jinzhou Bay Airport (JNZ)

On average, flying from Luang Namtha to Jinzhou generates about 200 kg of CO2 per passenger; 200 kilograms is equal to 442 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
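Per-passenger CO2 figures like this one are usually computed as distance times an emission factor. A minimal sketch follows; the factor below (kg of CO2 per passenger-mile) is an assumed illustrative value, not the one this site uses:

```python
KG_PER_LB = 0.45359237  # exact kilograms-per-pound conversion

def co2_per_passenger_kg(distance_miles, factor_kg_per_mile=0.11):
    """Crude estimate: distance times an assumed per-mile emission factor."""
    return distance_miles * factor_kg_per_mile

kg = co2_per_passenger_kg(1803)
print(round(kg), "kg =", round(kg / KG_PER_LB), "lb")  # ~198 kg = ~437 lb with this factor
```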

Map of flight path and driving directions from Luang Namtha to Jinzhou

See the map of the shortest flight path between Louang Namtha Airport (LXG) and Jinzhou Bay Airport (JNZ).

Airport information

Origin: Louang Namtha Airport
City: Luang Namtha
Country: Laos
IATA Code: LXG
ICAO Code: VLLN
Coordinates: 20°58′1″N, 101°24′0″E
Destination: Jinzhou Bay Airport
City: Jinzhou
Country: China
IATA Code: JNZ
ICAO Code: ZYJZ
Coordinates: 41°6′5″N, 121°3′43″E