How far is Lijiang from Weihai?
The distance between Weihai (Weihai Dashuibo Airport) and Lijiang (Lijiang Sanyi International Airport) is 1475 miles / 2374 kilometers / 1282 nautical miles.
The driving distance from Weihai (WEH) to Lijiang (LJG) is 1808 miles / 2909 kilometers, and travel time by car is about 32 hours 52 minutes.
Weihai Dashuibo Airport – Lijiang Sanyi International Airport
Distance from Weihai to Lijiang
There are several ways to calculate the distance from Weihai to Lijiang. Here are two standard methods:
Vincenty's formula (applied above)
- 1475.365 miles
- 2374.369 kilometers
- 1282.057 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 1474.056 miles
- 2372.264 kilometers
- 1280.920 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
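The haversine formula described above can be sketched in a few lines of Python. The mean Earth radius of 6371 km is a conventional choice for the spherical model; the decimal coordinates are converted from the airport table below.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance (km) between two points on a spherical Earth."""
    R = 6371.0  # mean Earth radius in kilometers
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Airport coordinates in decimal degrees (from the airport information tables)
weh = (37.186944, 122.228889)   # Weihai Dashuibo Airport
ljg = (26.679167, 100.245556)   # Lijiang Sanyi International Airport

d = haversine_km(*weh, *ljg)
print(round(d, 1))  # roughly 2372 km, in line with the figure above
```

Vincenty's formula, by contrast, solves the inverse geodesic problem iteratively on an ellipsoid, which is why it yields a slightly different (and slightly more accurate) distance.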
How long does it take to fly from Weihai to Lijiang?
The estimated flight time from Weihai Dashuibo Airport to Lijiang Sanyi International Airport is 3 hours and 17 minutes.
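An estimate of this kind is typically derived from the great-circle distance, an assumed average cruise speed, and a fixed overhead for taxi, takeoff, and landing. The 500 mph speed and 30-minute overhead below are illustrative assumptions, not the exact method behind the figure above.

```python
# Rough flight-time estimate from great-circle distance.
CRUISE_MPH = 500       # assumed average speed (illustrative)
OVERHEAD_MIN = 30      # assumed taxi/takeoff/landing overhead (illustrative)

distance_miles = 1475  # great-circle distance from the section above

total_min = OVERHEAD_MIN + distance_miles / CRUISE_MPH * 60
hours, minutes = divmod(round(total_min), 60)
print(f"{hours} h {minutes} min")  # 3 h 27 min under these assumptions
```

The result lands close to the quoted 3 hours 17 minutes; the small gap reflects whatever cruise speed and overhead the published estimate actually assumes.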
What is the time difference between Weihai and Lijiang?
There is no time difference between Weihai and Lijiang. Both cities observe China Standard Time (UTC+8).
Flight carbon footprint between Weihai Dashuibo Airport (WEH) and Lijiang Sanyi International Airport (LJG)
On average, flying from Weihai to Lijiang generates about 178 kg of CO2 per passenger; 178 kilograms equals 392 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path and driving directions from Weihai to Lijiang
See the map of the shortest flight path between Weihai Dashuibo Airport (WEH) and Lijiang Sanyi International Airport (LJG).
Airport information
| Origin | Weihai Dashuibo Airport |
|---|---|
| City: | Weihai |
| Country: | China |
| IATA Code: | WEH |
| ICAO Code: | ZSWH |
| Coordinates: | 37°11′13″N, 122°13′44″E |
| Destination | Lijiang Sanyi International Airport |
|---|---|
| City: | Lijiang |
| Country: | China |
| IATA Code: | LJG |
| ICAO Code: | ZPLJ |
| Coordinates: | 26°40′45″N, 100°14′44″E |