How far is Lijiang from Siliguri?
The distance between Siliguri (Bagdogra Airport) and Lijiang (Lijiang Sanyi International Airport) is 737 miles / 1186 kilometers / 640 nautical miles.
The driving distance from Siliguri (IXB) to Lijiang (LJG) is 1264 miles / 2034 kilometers, and travel time by car is about 26 hours 55 minutes.
Bagdogra Airport – Lijiang Sanyi International Airport
Distance from Siliguri to Lijiang
There are several ways to calculate the distance from Siliguri to Lijiang. Here are two standard methods:
Vincenty's formula (applied above)
- 736.784 miles
- 1185.739 kilometers
- 640.248 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
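For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name is ours, the coordinates come from the airport table below, and the exact output depends on the ellipsoid constants and convergence tolerance used:

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid (distance in metres)."""
    a = 6378137.0            # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial line
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)  # geodesic distance in metres

# IXB and LJG coordinates from the airport table below
print(vincenty_distance_m(26.681111, 88.328333, 26.679167, 100.245556) / 1000)
# should come out near 1185.7 km, in line with the figure above
```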
Haversine formula
- 735.464 miles
- 1183.614 kilometers
- 639.101 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
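The haversine formula is compact enough to show in full. In this Python sketch, the 6371 km mean Earth radius is an assumption; a slightly different radius shifts the result by a few tenths of a kilometer:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# IXB to LJG, coordinates from the airport table below
print(haversine_km(26.681111, 88.328333, 26.679167, 100.245556))
# ≈ 1183.6 km, matching the figure above to within rounding
```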
How long does it take to fly from Siliguri to Lijiang?
The estimated flight time from Bagdogra Airport to Lijiang Sanyi International Airport is 1 hour and 53 minutes.
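The page does not state how this estimate is derived. A common rule of thumb, assumed here for illustration only, divides the flight distance by a typical cruise speed and adds a fixed allowance for taxi, climb, and descent; the function name and parameters below are ours, not the site's:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: cruise leg plus a fixed taxi/climb/descent overhead."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(737))  # ≈ 1 h 58 min with these assumed parameters
```

With these assumed parameters the rule of thumb lands in the same ballpark as the quoted 1 hour 53 minutes; the site evidently uses slightly different constants.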
What is the time difference between Siliguri and Lijiang?
The time difference between Siliguri and Lijiang is 2 hours 30 minutes: Siliguri observes Indian Standard Time (UTC+5:30), while Lijiang observes China Standard Time (UTC+8), so Lijiang is 2 hours 30 minutes ahead.
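This is easy to verify with Python's standard zoneinfo module (available since Python 3.9):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Compare the UTC offsets of the two airports' time zones.
kolkata = datetime.now(ZoneInfo("Asia/Kolkata"))          # Siliguri: IST, UTC+5:30
shanghai = kolkata.astimezone(ZoneInfo("Asia/Shanghai"))  # Lijiang: CST, UTC+8
print(shanghai.utcoffset() - kolkata.utcoffset())  # 2:30:00
```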
Flight carbon footprint between Bagdogra Airport (IXB) and Lijiang Sanyi International Airport (LJG)
On average, flying from Siliguri to Lijiang generates about 129 kg of CO2 per passenger, which is roughly 284 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
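The site does not publish its methodology. The sketch below shows how a distance-based estimate of this kind is typically computed; the emissions factor is a hypothetical constant chosen to reproduce the order of magnitude above, not the site's actual value:

```python
KG_PER_PASSENGER_KM = 0.109  # assumed emissions factor (kg CO2 per passenger-km), not the site's
LBS_PER_KG = 2.20462

def co2_per_passenger(distance_km, factor=KG_PER_PASSENGER_KM):
    """Rough per-passenger CO2 from burning jet fuel over a given distance."""
    kg = distance_km * factor
    return kg, kg * LBS_PER_KG

kg, lbs = co2_per_passenger(1186)
print(f"{kg:.0f} kg ≈ {lbs:.0f} lbs")  # ≈ 129 kg ≈ 285 lbs with this assumed factor
```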
Map of flight path and driving directions from Siliguri to Lijiang
See the map of the shortest flight path between Bagdogra Airport (IXB) and Lijiang Sanyi International Airport (LJG).
Airport information
Origin | Bagdogra Airport
--- | ---
City: | Siliguri
Country: | India
IATA Code: | IXB
ICAO Code: | VEBD
Coordinates: | 26°40′52″N, 88°19′42″E
Destination | Lijiang Sanyi International Airport
--- | ---
City: | Lijiang
Country: | China
IATA Code: | LJG
ICAO Code: | ZPLJ
Coordinates: | 26°40′45″N, 100°14′44″E