How far is Lijiang from Hongping?
The distance between Hongping (Shennongjia Hongping Airport) and Lijiang (Lijiang Sanyi International Airport) is 698 miles / 1124 kilometers / 607 nautical miles.
The driving distance from Hongping (HPG) to Lijiang (LJG) is 1031 miles / 1659 kilometers, and travel time by car is about 19 hours 17 minutes.
Shennongjia Hongping Airport – Lijiang Sanyi International Airport
Distance from Hongping to Lijiang
There are several ways to calculate the distance from Hongping to Lijiang. Here are two standard methods:
Vincenty's formula (applied above)
- 698.437 miles
- 1124.026 kilometers
- 606.925 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
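The iterative Vincenty inverse solution can be sketched in plain Python. This is a minimal implementation of the standard published algorithm on the WGS-84 ellipsoid; the decimal coordinates are converted from the DMS values in the airport table below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0                  # equatorial radius (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # polar radius (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate until the longitude term converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0             # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)   # equatorial-line special case
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                           * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# HPG and LJG coordinates from the airport table below, in decimal degrees
hpg = (31.62583, 110.34000)    # 31°37′33″N, 110°20′24″E
ljg = (26.67917, 100.24556)    # 26°40′45″N, 100°14′44″E
metres = vincenty_distance(*hpg, *ljg)
```

For this airport pair the result should land very close to the 1,124 km figure quoted above.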
Haversine formula
- 697.956 miles
- 1123.251 kilometers
- 606.507 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
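The haversine calculation is much simpler, since the sphere needs no iteration. A minimal sketch, using the conventional 6,371 km mean Earth radius (the choice of radius is an assumption and shifts the result slightly):

```python
import math

EARTH_RADIUS_KM = 6371.0   # mean Earth radius; the spherical-model assumption

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(h))

km = haversine_km(31.62583, 110.34000, 26.67917, 100.24556)  # HPG -> LJG
```

This comes out within a few kilometres of the 1,123.251 km listed above; the small gap from the Vincenty figure reflects the spherical vs. ellipsoidal Earth models.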
How long does it take to fly from Hongping to Lijiang?
The estimated flight time from Shennongjia Hongping Airport to Lijiang Sanyi International Airport is 1 hour and 49 minutes.
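Estimates like this typically come from a simple rule of thumb: distance divided by an average block speed, plus a fixed allowance for taxi, climb, and descent. The speed and overhead below are assumed illustrative values, not this site's published parameters, so the result is close to but not exactly the 1 h 49 min quoted:

```python
def flight_time_minutes(distance_miles, avg_speed_mph=500, overhead_min=30):
    # Rule-of-thumb estimate: cruise segment plus a fixed taxi/climb/descent
    # allowance. Both default parameters are assumptions for illustration.
    return distance_miles / avg_speed_mph * 60 + overhead_min

minutes = flight_time_minutes(698.437)   # roughly 114 minutes, i.e. ~1 h 54 min
```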
What is the time difference between Hongping and Lijiang?
There is no time difference between Hongping and Lijiang: both cities observe China Standard Time (UTC+8).
Flight carbon footprint between Shennongjia Hongping Airport (HPG) and Lijiang Sanyi International Airport (LJG)
On average, flying from Hongping to Lijiang generates about 124 kg of CO2 per passenger, which is roughly 273 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
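The per-passenger figure implies an emission rate of about 0.178 kg of CO2 per mile for this route (124 kg over 698.437 miles). A minimal sketch, noting that this rate is back-derived from the article's own numbers rather than a general per-flight constant:

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound

def co2_for_route(distance_miles, kg_per_mile=124 / 698.437):
    """Per-passenger CO2 estimate in (kg, lb). The default kg-per-mile
    factor is derived from this article's 124 kg / 698.437-mile figures."""
    kg = distance_miles * kg_per_mile
    return kg, kg / KG_PER_LB

kg, lbs = co2_for_route(698.437)   # 124.0 kg, about 273 lb
```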
Map of flight path and driving directions from Hongping to Lijiang
See the map of the shortest flight path between Shennongjia Hongping Airport (HPG) and Lijiang Sanyi International Airport (LJG).
Airport information
| Origin | Shennongjia Hongping Airport |
| --- | --- |
| City | Hongping |
| Country | China |
| IATA Code | HPG |
| ICAO Code | ZHSN |
| Coordinates | 31°37′33″N, 110°20′24″E |

| Destination | Lijiang Sanyi International Airport |
| --- | --- |
| City | Lijiang |
| Country | China |
| IATA Code | LJG |
| ICAO Code | ZPLJ |
| Coordinates | 26°40′45″N, 100°14′44″E |