How far is Luang Namtha from Changzhi?
The distance between Changzhi (Changzhi Wangcun Airport) and Luang Namtha (Louang Namtha Airport) is 1269 miles / 2041 kilometers / 1102 nautical miles.
The driving distance from Changzhi (CIH) to Luang Namtha (LXG) is 1688 miles / 2716 kilometers, and travel time by car is about 30 hours 57 minutes.
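As a quick sanity check, the implied average driving speed follows directly from these figures (a back-of-the-envelope calculation from the numbers above, not a figure published by the source):

```python
# Average driving speed implied by the quoted driving distance and travel time.
distance_km = 2716               # driving distance from the figures above
hours = 30 + 57 / 60             # 30 hours 57 minutes as decimal hours
avg_speed = distance_km / hours  # kilometers per hour
print(f"{avg_speed:.1f} km/h")   # roughly 88 km/h (about 55 mph)
```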
Changzhi Wangcun Airport – Louang Namtha Airport
Distance from Changzhi to Luang Namtha
There are several ways to calculate the distance from Changzhi to Luang Namtha. Here are two standard methods:
Vincenty's formula (applied above)
- 1268.518 miles
- 2041.481 kilometers
- 1102.312 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
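A minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (an independent illustration, not the site's own implementation; the airport coordinates are taken from the table below, converted to decimal degrees):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, iters=200, tol=1e-12):
    """Distance in km via Vincenty's inverse formula on the WGS-84 ellipsoid."""
    a, f = 6378137.0, 1 / 298.257223563  # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(iters):  # iterate until the longitude difference converges
        sinlam, coslam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinlam, cosU1 * sinU2 - sinU1 * cosU2 * coslam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000  # ellipsoidal distance in km

# CIH (36°14′51″N, 113°7′33″E) to LXG (20°58′1″N, 101°24′0″E)
print(round(vincenty_km(36.2475, 113.125833, 20.966944, 101.4), 1))  # ≈ 2041 km
```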
Haversine formula
- 1270.628 miles
- 2044.878 kilometers
- 1104.146 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
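The haversine version is compact enough to sketch directly (an illustrative implementation assuming a mean Earth radius of 6371 km, with the airport coordinates from the table below):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r (mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# CIH (36°14′51″N, 113°7′33″E) to LXG (20°58′1″N, 101°24′0″E)
d = haversine_km(36.2475, 113.125833, 20.966944, 101.4)
print(round(d, 1))  # close to the 2044.878 km quoted above
```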
How long does it take to fly from Changzhi to Luang Namtha?
The estimated flight time from Changzhi Wangcun Airport to Louang Namtha Airport is 2 hours and 54 minutes.
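The source does not state its flight-time model. A common back-of-the-envelope estimate assumes an average cruise speed of about 500 mph plus roughly 30 minutes for taxi, takeoff, and landing; both parameters here are assumptions, which is why this sketch lands a few minutes above the 2 hours 54 minutes quoted above:

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block time: cruise at a fixed average speed plus fixed overhead."""
    return round(distance_miles / cruise_mph * 60 + overhead_min)

total = flight_time_minutes(1268.518)       # great-circle distance from above
print(f"{total // 60} h {total % 60} min")  # about 3 h 2 min with these assumptions
```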
What is the time difference between Changzhi and Luang Namtha?
The time difference between Changzhi and Luang Namtha is 1 hour: China observes UTC+8 and Laos UTC+7, so Luang Namtha is 1 hour behind Changzhi.
Flight carbon footprint between Changzhi Wangcun Airport (CIH) and Louang Namtha Airport (LXG)
On average, flying from Changzhi to Luang Namtha generates about 165 kg of CO2 per passenger, equivalent to roughly 364 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
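The per-passenger figure implies an emission factor of roughly 0.08 kg of CO2 per kilometer flown; the conversion to pounds is a straight unit change. The implied factor below is an inference from the numbers above, not a published constant:

```python
KG_PER_LB = 0.45359237         # exact definition of the avoirdupois pound

co2_kg = 165                   # per-passenger estimate from above
distance_km = 2041             # great-circle distance from above

factor = co2_kg / distance_km  # implied kg CO2 per passenger-km
co2_lbs = co2_kg / KG_PER_LB   # unit conversion to pounds
print(round(factor, 3), round(co2_lbs))  # ~0.081 kg/km, ~364 lbs
```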
Map of flight path and driving directions from Changzhi to Luang Namtha
See the map of the shortest flight path between Changzhi Wangcun Airport (CIH) and Louang Namtha Airport (LXG).
Airport information
Origin | Changzhi Wangcun Airport |
---|---|
City: | Changzhi |
Country: | China |
IATA Code: | CIH |
ICAO Code: | ZBCZ |
Coordinates: | 36°14′51″N, 113°7′33″E |
Destination | Louang Namtha Airport |
---|---|
City: | Luang Namtha |
Country: | Laos |
IATA Code: | LXG |
ICAO Code: | VLLN |
Coordinates: | 20°58′1″N, 101°24′0″E |
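The coordinates above are given in degrees, minutes, and seconds; a small helper converts them to the decimal degrees the distance formulas expect (an illustrative parser for this table's notation):

```python
import re

def dms_to_decimal(dms):
    """Convert a coordinate like 36°14′51″N to signed decimal degrees."""
    deg, minutes, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(dms_to_decimal("36°14′51″N"))  # 36.2475
print(dms_to_decimal("101°24′0″E"))
```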