How far is Lijiang from Shiyan?
The distance between Shiyan (Shiyan Wudangshan Airport) and Lijiang (Lijiang Sanyi International Airport) is 759 miles / 1222 kilometers / 660 nautical miles.
The driving distance from Shiyan (WDS) to Lijiang (LJG) is 1017 miles / 1636 kilometers, and travel time by car is about 18 hours 41 minutes.
Shiyan Wudangshan Airport – Lijiang Sanyi International Airport
Distance from Shiyan to Lijiang
There are several ways to calculate the distance from Shiyan to Lijiang. Here are two standard methods:
Vincenty's formula (applied above)
- 759.218 miles
- 1221.843 kilometers
- 659.743 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
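As a sketch of how such an ellipsoidal calculation works, here is the standard Vincenty inverse iteration on the WGS-84 ellipsoid, applied to the two airports' coordinates (converted to decimal degrees). This is a minimal illustration, not necessarily the exact implementation used for the figure above.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns km."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = a * (1 - f)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:   # iterate until longitude converges
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1000

# WDS (32°35′30″N, 110°54′28″E) and LJG (26°40′45″N, 100°14′44″E)
print(round(vincenty_km(32.5917, 110.9078, 26.6792, 100.2456), 1))
```

With these coordinates the result lands within about a kilometer of the 1221.843 km figure quoted above; small differences come from rounding the coordinates.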
Haversine formula
- 758.860 miles
- 1221.267 kilometers
- 659.431 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth. This is the great-circle distance: the shortest path between two points along the surface of the sphere.
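The spherical calculation is much simpler. Here is a minimal haversine sketch, using a mean earth radius of 6371 km and the airport coordinates converted to decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(a))

# WDS (32°35′30″N, 110°54′28″E) and LJG (26°40′45″N, 100°14′44″E)
print(round(haversine_km(32.5917, 110.9078, 26.6792, 100.2456), 1))  # ~1221 km
```

The result agrees with the 1221.267 km figure above to within a few kilometers; the exact value depends on the radius chosen and the coordinate rounding.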
How long does it take to fly from Shiyan to Lijiang?
The estimated flight time from Shiyan Wudangshan Airport to Lijiang Sanyi International Airport is 1 hour and 56 minutes.
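A common rule of thumb for such estimates (assumed here, not necessarily this page's method) is the great-circle distance flown at roughly 500 mph cruise, plus about 30 minutes for taxi, climb, and descent:

```python
distance_mi = 759.218  # Vincenty distance from above
cruise_mph = 500       # assumed average cruise speed
buffer_min = 30        # assumed taxi/climb/descent allowance

total_min = distance_mi / cruise_mph * 60 + buffer_min
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")
```

This heuristic gives roughly 2 hours, close to the 1 hour 56 minutes quoted above.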
What is the time difference between Shiyan and Lijiang?
There is no time difference between Shiyan and Lijiang: both cities observe China Standard Time (UTC+8).
Flight carbon footprint between Shiyan Wudangshan Airport (WDS) and Lijiang Sanyi International Airport (LJG)
On average, flying from Shiyan to Lijiang generates about 131 kg (288 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
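The kilogram-to-pound conversion is straightforward arithmetic, using the exact definition of the pound (1 lb = 0.45359237 kg):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 131            # per-passenger estimate from above
co2_lb = co2_kg / KG_PER_LB
print(f"{co2_lb:.1f} lb")  # ~288.8 lb; the page rounds this to 288
```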
Map of flight path and driving directions from Shiyan to Lijiang
See the map of the shortest flight path between Shiyan Wudangshan Airport (WDS) and Lijiang Sanyi International Airport (LJG).
Airport information
| Origin | Shiyan Wudangshan Airport |
|---|---|
| City: | Shiyan |
| Country: | China |
| IATA Code: | WDS |
| ICAO Code: | ZHSY |
| Coordinates: | 32°35′30″N, 110°54′28″E |
| Destination | Lijiang Sanyi International Airport |
|---|---|
| City: | Lijiang |
| Country: | China |
| IATA Code: | LJG |
| ICAO Code: | ZPLJ |
| Coordinates: | 26°40′45″N, 100°14′44″E |