How far is Lijiang from Lüliang?
The distance between Lüliang (Lüliang Dawu Airport) and Lijiang (Lijiang Sanyi International Airport) is 990 miles / 1593 kilometers / 860 nautical miles.
The driving distance from Lüliang (LLV) to Lijiang (LJG) is 1282 miles / 2063 kilometers, and travel time by car is about 23 hours 38 minutes.
Lüliang Dawu Airport – Lijiang Sanyi International Airport
Distance from Lüliang to Lijiang
There are several ways to calculate the distance from Lüliang to Lijiang. Here are two standard methods:
Vincenty's formula (applied above)
- 989.738 miles
- 1592.829 kilometers
- 860.059 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
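For readers who want to reproduce the ellipsoidal figure, here is a minimal sketch using the geographiclib package, which implements Karney's geodesic algorithm rather than Vincenty's iteration; on the WGS84 ellipsoid the two agree to well under a metre, so the result should land close to the 989.738-mile value quoted above. The decimal coordinates are converted from the DMS values in the airport tables below.

```python
# Ellipsoidal (WGS84) distance between the two airports, for comparison with
# the Vincenty figure quoted above. geographiclib uses Karney's algorithm,
# not Vincenty's iteration, but the results are practically identical.
from geographiclib.geodesic import Geodesic

LLV = (37.6831, 111.1428)   # Lüliang Dawu Airport, 37°40′59″N 111°8′34″E
LJG = (26.6792, 100.2456)   # Lijiang Sanyi Intl Airport, 26°40′45″N 100°14′44″E

result = Geodesic.WGS84.Inverse(LLV[0], LLV[1], LJG[0], LJG[1])
km = result["s12"] / 1000   # geodesic distance, metres -> kilometres
print(f"{km:.1f} km, {km * 0.621371:.1f} mi, {km * 0.539957:.1f} NM")
# Prints roughly 1592.8 km / 989.7 mi / 860.1 NM
```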
Haversine formula
- 990.487 miles
- 1594.034 kilometers
- 860.709 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
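The haversine result is easy to reproduce with no dependencies. The sketch below assumes a mean earth radius of 6371 km; other common radius choices shift the answer by a kilometre or two.

```python
# Self-contained haversine (great-circle) distance on a spherical earth,
# matching the haversine figure quoted above.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two latitude/longitude points, in km."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(37.6831, 111.1428, 26.6792, 100.2456)  # LLV -> LJG
print(f"{km:.1f} km, {km * 0.621371:.1f} mi, {km * 0.539957:.1f} NM")
# Prints roughly 1594.0 km / 990.5 mi / 860.7 NM
```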
How long does it take to fly from Lüliang to Lijiang?
The estimated flight time from Lüliang Dawu Airport to Lijiang Sanyi International Airport is 2 hours and 22 minutes.
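The article does not say how this estimate is derived, but figures like this are commonly built from the great-circle distance, a typical cruise speed, and a fixed allowance for taxi, climb and descent. The sketch below assumes a 500 mph cruise speed and a 30-minute allowance, which are illustrative parameters rather than the site's actual ones, so it only roughly approximates the quoted 2 hours 22 minutes.

```python
# Rough flight-time estimate: distance / assumed cruise speed + fixed overhead.
# The 500 mph and 30 min values are assumptions for illustration only.
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    minutes = distance_miles / cruise_mph * 60 + overhead_min
    return divmod(round(minutes), 60)  # (hours, minutes)

hours, mins = estimated_flight_time(990)
print(f"about {hours} h {mins} min")  # about 2 h 29 min with these assumptions
```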
What is the time difference between Lüliang and Lijiang?
There is no time difference between Lüliang and Lijiang; both cities observe China Standard Time (UTC+8).
Flight carbon footprint between Lüliang Dawu Airport (LLV) and Lijiang Sanyi International Airport (LJG)
On average, flying from Lüliang to Lijiang generates about 150 kg (331 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
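Both numbers follow from simple arithmetic on the article's own estimate, as the quick check below shows.

```python
# Quick check of the carbon figures above: the kilogram-to-pound conversion
# and the implied per-passenger emission rate for this route, both derived
# from the article's own estimates.
co2_kg = 150
distance_miles = 990

co2_lbs = co2_kg * 2.20462
kg_per_mile = co2_kg / distance_miles

print(f"{co2_kg} kg ≈ {co2_lbs:.0f} lbs")                 # 150 kg ≈ 331 lbs
print(f"≈ {kg_per_mile:.2f} kg CO2 per passenger-mile")   # ≈ 0.15 kg per mile
```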
Map of flight path and driving directions from Lüliang to Lijiang
See the map of the shortest flight path between Lüliang Dawu Airport (LLV) and Lijiang Sanyi International Airport (LJG).
Airport information
| Origin | Lüliang Dawu Airport |
| --- | --- |
| City | Lüliang |
| Country | China |
| IATA Code | LLV |
| ICAO Code | ZBLL |
| Coordinates | 37°40′59″N, 111°8′34″E |
| Destination | Lijiang Sanyi International Airport |
| --- | --- |
| City | Lijiang |
| Country | China |
| IATA Code | LJG |
| ICAO Code | ZPLJ |
| Coordinates | 26°40′45″N, 100°14′44″E |
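The coordinates above are given in degrees, minutes and seconds, while the distance sketches earlier use decimal degrees. A small, purely illustrative helper for the conversion:

```python
# Convert the DMS coordinates from the airport tables to decimal degrees.
# The function name and tuple layout are illustrative, not from the source.
def dms_to_decimal(degrees, minutes, seconds, negative=False):
    value = degrees + minutes / 60 + seconds / 3600
    return -value if negative else value

llv = (dms_to_decimal(37, 40, 59), dms_to_decimal(111, 8, 34))    # LLV
ljg = (dms_to_decimal(26, 40, 45), dms_to_decimal(100, 14, 44))   # LJG
print(llv)  # approximately (37.6831, 111.1428)
print(ljg)  # approximately (26.6792, 100.2456)
```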