How far is Lijiang from Shache?

The distance between Shache (Shache Airport) and Lijiang (Lijiang Sanyi International Airport) is 1565 miles / 2519 kilometers / 1360 nautical miles.

The driving distance from Shache (QSZ) to Lijiang (LJG) is 2379 miles / 3828 kilometers, and travel time by car is about 47 hours 21 minutes.

Distance from Shache to Lijiang

There are several ways to calculate the distance from Shache to Lijiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1565.073 miles
  • 2518.740 kilometers
  • 1360.011 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
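The sketch below is a minimal Python implementation of Vincenty's inverse method. It assumes the WGS-84 ellipsoid and converts the airport coordinates listed under "Airport information" below from degrees-minutes-seconds; it is an illustration, not the calculator's exact code, so its output may differ slightly from the figures above.

```python
import math

# WGS-84 ellipsoid parameters (assumed; the page does not state which ellipsoid it uses)
A_AXIS = 6378137.0                  # semi-major axis, metres
FLATTENING = 1 / 298.257223563      # flattening
B_AXIS = (1 - FLATTENING) * A_AXIS  # semi-minor axis, metres

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in metres between two lat/lon points via Vincenty's inverse formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)

    U1 = math.atan((1 - FLATTENING) * math.tan(phi1))
    U2 = math.atan((1 - FLATTENING) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
        C = FLATTENING / 16 * cos_sq_alpha * (4 + FLATTENING * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * FLATTENING * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                                     + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma)

# Airport coordinates from the page, converted from DMS to decimal degrees
qsz = (38 + 16/60 + 51/3600, 77 + 4/60 + 30/3600)     # Shache (QSZ)
ljg = (26 + 40/60 + 45/3600, 100 + 14/60 + 44/3600)   # Lijiang (LJG)

metres = vincenty_inverse(qsz[0], qsz[1], ljg[0], ljg[1])
print(f"{metres / 1000:.3f} km")         # roughly 2518.7 km
print(f"{metres / 1609.344:.3f} miles")  # roughly 1565.1 miles
```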

Haversine formula
  • 1563.768 miles
  • 2516.641 kilometers
  • 1358.877 nautical miles

The haversine formula calculates the distance between latitude/longitude points on a spherical model of the earth, giving the great-circle distance, i.e. the shortest path between the two points along the sphere's surface.
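For comparison, here is a minimal Python sketch of the haversine formula. The mean Earth radius of 6371 km is an assumption (the page does not state which radius the calculator uses), but with the airport coordinates listed below it lands very close to the figures quoted above.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius; an assumed value

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# Airport coordinates from the page (decimal degrees)
km = haversine_km(38.28083, 77.07500, 26.67917, 100.24556)
print(f"{km:.3f} km")             # roughly 2516.6 km
print(f"{km / 1.609344:.3f} mi")  # roughly 1563.8 miles
print(f"{km / 1.852:.3f} nmi")    # roughly 1358.9 nautical miles
```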

How long does it take to fly from Shache to Lijiang?

The estimated flight time from Shache Airport to Lijiang Sanyi International Airport is 3 hours and 27 minutes.
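The page does not say how this estimate is derived. As a rough sanity check, the quoted distance and time imply an average speed of about 454 mph, which is plausible once taxi, climb, and descent are folded in:

```python
# Sanity check using the figures quoted on this page
distance_miles = 1565.073
flight_hours = 3 + 27 / 60                       # 3 hours 27 minutes
print(f"{distance_miles / flight_hours:.0f} mph")  # about 454 mph average speed
```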

What is the time difference between Shache and Lijiang?

There is no time difference between Shache and Lijiang.

Flight carbon footprint between Shache Airport (QSZ) and Lijiang Sanyi International Airport (LJG)

On average, flying from Shache to Lijiang generates about 184 kg of CO2 per passenger (roughly 405 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
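The kilogram-to-pound conversion can be checked directly (1 kg is approximately 2.20462 lb):

```python
co2_kg = 184
print(f"{co2_kg * 2.20462:.1f} lb")  # about 405.7 lb, matching the ~405 pounds quoted above
```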

Map of flight path and driving directions from Shache to Lijiang

See the map of the shortest flight path between Shache Airport (QSZ) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin: Shache Airport
City: Shache
Country: China
IATA Code: QSZ
ICAO Code: ZWSC
Coordinates: 38°16′51″N, 77°4′30″E
Destination: Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E