How far is Lijiang from Saga?
The distance between Saga (Saga Airport) and Lijiang (Lijiang Sanyi International Airport) is 1851 miles / 2979 kilometers / 1608 nautical miles.
The driving distance from Saga (HSG) to Lijiang (LJG) is 2825 miles / 4547 kilometers, and travel time by car is about 55 hours 32 minutes.
Saga Airport – Lijiang Sanyi International Airport
Distance from Saga to Lijiang
There are several ways to calculate the distance from Saga to Lijiang. Here are two standard methods:
Vincenty's formula (applied above)
- 1850.815 miles
- 2978.599 kilometers
- 1608.315 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
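For the curious, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. This is an assumed, self-contained implementation for illustration, not this site's actual code; the decimal coordinates are converted from the DMS values in the airport tables below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty: geodesic distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0                    # semi-major axis (m)
    f = 1 / 298.257223563            # flattening
    b = (1 - f) * a                  # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0               # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break                    # converged

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (2 * cos_2sm ** 2 - 1)
        - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return b * A * (sigma - d_sigma)

# HSG -> LJG, decimal degrees converted from the airport tables below
m = vincenty_distance(33.1494, 130.3019, 26.6792, 100.2456)
print(m / 1000, m / 1852)            # ≈ 2978.6 km, ≈ 1608.3 NM
```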
Haversine formula
- 1847.751 miles
- 2973.667 kilometers
- 1605.652 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
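For comparison, a short Python sketch of the haversine formula; the mean Earth radius of 6,371 km is an assumed convention, and the small gap from the Vincenty figure comes from treating the Earth as a sphere.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_km(33.1494, 130.3019, 26.6792, 100.2456))  # ≈ 2973.6 km
```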
How long does it take to fly from Saga to Lijiang?
The estimated flight time from Saga Airport to Lijiang Sanyi International Airport is 4 hours and 0 minutes.
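The 4-hour figure can be sanity-checked with a simple rule of thumb; the cruise speed and overhead below are assumptions, not airline data.

```python
# Rough sanity check, not an airline schedule: great-circle distance at a
# typical jet cruise speed, plus an assumed allowance for taxi/climb/descent.
distance_km = 2979        # from the figures above
cruise_kmh = 805          # ~500 mph, an assumed narrow-body cruise speed
overhead_h = 0.5          # assumed taxi, climb, and descent allowance

hours = distance_km / cruise_kmh + overhead_h
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ≈ 4 h 12 min
```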
What is the time difference between Saga and Lijiang?
The time difference between Saga and Lijiang is 1 hour. Lijiang is 1 hour behind Saga.
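This follows from the two countries' time zones: Saga uses Japan Standard Time (UTC+9), while Lijiang, like all of China, uses China Standard Time (UTC+8). A quick check with Python's standard zoneinfo module (the date is arbitrary):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library, Python 3.9+

# Noon in Saga (Japan Standard Time, UTC+9) viewed from Lijiang
# (China Standard Time, UTC+8).
saga_noon = datetime(2024, 6, 1, 12, 0, tzinfo=ZoneInfo("Asia/Tokyo"))
print(saga_noon.astimezone(ZoneInfo("Asia/Shanghai")))
# 2024-06-01 11:00:00+08:00  -> Lijiang is 1 hour behind
```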
Flight carbon footprint between Saga Airport (HSG) and Lijiang Sanyi International Airport (LJG)
On average, flying from Saga to Lijiang generates about 204 kg (450 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
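The unit conversion, plus the per-kilometre rate implied by the quoted figures (derived here only by dividing them; the underlying emission model is not published on this page):

```python
co2_kg = 204                          # quoted per-passenger estimate
print(round(co2_kg * 2.20462))        # 450 lbs (kilograms to pounds)
print(round(co2_kg / 2979 * 1000))    # ≈ 68 g of CO2 per passenger-km
```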
Map of flight path and driving directions from Saga to Lijiang
See the map of the shortest flight path between Saga Airport (HSG) and Lijiang Sanyi International Airport (LJG).
Airport information
| Origin | Saga Airport |
| --- | --- |
| City | Saga |
| Country | Japan |
| IATA Code | HSG |
| ICAO Code | RJFS |
| Coordinates | 33°8′58″N, 130°18′7″E |
| Destination | Lijiang Sanyi International Airport |
| --- | --- |
| City | Lijiang |
| Country | China |
| IATA Code | LJG |
| ICAO Code | ZPLJ |
| Coordinates | 26°40′45″N, 100°14′44″E |
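The coordinates above are given in degrees/minutes/seconds, while the distance sketches earlier use decimal degrees. A small helper (hypothetical, for illustration) makes the conversion explicit:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Saga Airport (HSG): 33°8′58″N, 130°18′7″E
print(dms_to_decimal(33, 8, 58, "N"), dms_to_decimal(130, 18, 7, "E"))
# Lijiang Sanyi International Airport (LJG): 26°40′45″N, 100°14′44″E
print(dms_to_decimal(26, 40, 45, "N"), dms_to_decimal(100, 14, 44, "E"))
```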