How far is Lijiang from Yingkou?
The distance between Yingkou (Yingkou Lanqi Airport) and Lijiang (Lijiang Sanyi International Airport) is 1586 miles / 2552 kilometers / 1378 nautical miles.
The driving distance from Yingkou (YKH) to Lijiang (LJG) is 1988 miles / 3200 kilometers, and travel time by car is about 36 hours 11 minutes.
Yingkou Lanqi Airport – Lijiang Sanyi International Airport
Distance from Yingkou to Lijiang
There are several ways to calculate the distance from Yingkou to Lijiang. Here are two standard methods:
Vincenty's formula (applied above)
- 1585.606 miles
- 2551.785 kilometers
- 1377.854 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
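To reproduce an ellipsoidal result in code, here is a minimal sketch using the third-party geopy library (assumed installed). Note that geopy's geodesic() uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's iteration, but the two agree to well under a metre for routes like this one.

```python
# Ellipsoidal (WGS-84) distance between YKH and LJG via geopy.
from geopy.distance import geodesic

ykh = (40.5425, 122.3583)   # 40°32′33″N, 122°21′30″E
ljg = (26.6792, 100.2456)   # 26°40′45″N, 100°14′44″E

d = geodesic(ykh, ljg)
print(f"{d.miles:.1f} mi / {d.kilometers:.1f} km / {d.nautical:.1f} nm")
# Expect roughly 1586 mi / 2552 km / 1378 nm.
```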
Haversine formula
- 1584.891 miles
- 2550.634 kilometers
- 1377.232 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
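As a concrete illustration, here is a self-contained Python sketch of the haversine calculation, using the airport coordinates from the tables below and a mean Earth radius of 6,371 km (an assumed value; other radii shift the result slightly).

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YKH: 40°32′33″N, 122°21′30″E    LJG: 26°40′45″N, 100°14′44″E
km = haversine_km(40.5425, 122.3583, 26.6792, 100.2456)
print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km / 1.852:.1f} nm")
# Expect roughly 2550 km / 1585 mi / 1377 nm, in line with the figures above.
```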
How long does it take to fly from Yingkou to Lijiang?
The estimated flight time from Yingkou Lanqi Airport to Lijiang Sanyi International Airport is 3 hours and 30 minutes.
What is the time difference between Yingkou and Lijiang?
There is no time difference between Yingkou and Lijiang; both cities observe China Standard Time (UTC+8).
Flight carbon footprint between Yingkou Lanqi Airport (YKH) and Lijiang Sanyi International Airport (LJG)
On average, flying from Yingkou to Lijiang generates about 185 kg (408 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path and driving directions from Yingkou to Lijiang
See the map of the shortest flight path between Yingkou Lanqi Airport (YKH) and Lijiang Sanyi International Airport (LJG).
Airport information
| Origin | Yingkou Lanqi Airport |
| --- | --- |
| City: | Yingkou |
| Country: | China |
| IATA Code: | YKH |
| ICAO Code: | ZYYK |
| Coordinates: | 40°32′33″N, 122°21′30″E |

| Destination | Lijiang Sanyi International Airport |
| --- | --- |
| City: | Lijiang |
| Country: | China |
| IATA Code: | LJG |
| ICAO Code: | ZPLJ |
| Coordinates: | 26°40′45″N, 100°14′44″E |