How far is Lijiang from Bhairawa?
The distance between Bhairawa (Gautam Buddha Airport) and Lijiang (Lijiang Sanyi International Airport) is 1038 miles / 1670 kilometers / 902 nautical miles.
The driving distance from Bhairawa (BWA) to Lijiang (LJG) is 1641 miles / 2641 kilometers, and travel time by car is about 34 hours 29 minutes.
Gautam Buddha Airport – Lijiang Sanyi International Airport
Distance from Bhairawa to Lijiang
There are several ways to calculate the distance from Bhairawa to Lijiang. Here are two standard methods:
Vincenty's formula (applied above)
- 1037.864 miles
- 1670.280 kilometers
- 901.879 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
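In practice, Vincenty's formula is rarely implemented by hand; a library geodesic solver on the WGS-84 ellipsoid gives essentially the same answer. The sketch below uses geopy (an assumption; the page does not say what tooling it uses), whose `geodesic` distance is ellipsoidal like Vincenty's formula and whose `great_circle` corresponds to the spherical haversine result.

```python
from geopy.distance import geodesic, great_circle

# BWA and LJG coordinates in decimal degrees (see the airport tables below)
bwa = (27.5056, 83.4161)
ljg = (26.6792, 100.2456)

print(f"{geodesic(bwa, ljg).miles:.3f} mi")      # ellipsoidal (WGS-84): ~1038 mi
print(f"{great_circle(bwa, ljg).miles:.3f} mi")  # spherical great circle: ~1036 mi
```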
Haversine formula
- 1036.001 miles
- 1667.281 kilometers
- 900.260 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
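To reproduce the spherical figure yourself, here is a minimal haversine sketch in Python. The Earth radius of 3958.8 miles is an assumption, since the page does not state which radius it uses.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    R = 3958.8  # mean Earth radius in miles (assumed; the page's value is unstated)
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * R * asin(sqrt(a))

# BWA and LJG coordinates in decimal degrees (see the airport tables below)
print(round(haversine_miles(27.5056, 83.4161, 26.6792, 100.2456), 1))  # ~1036 miles
```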
How long does it take to fly from Bhairawa to Lijiang?
The estimated flight time from Gautam Buddha Airport to Lijiang Sanyi International Airport is 2 hours and 27 minutes.
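The page does not explain how this estimate is derived. A common rule of thumb, sketched below with assumed constants (an average cruise speed of about 500 mph plus roughly 30 minutes for takeoff and landing), lands in the same ballpark; the small gap to the quoted 2 hours 27 minutes suggests the page uses slightly different constants.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airtime estimate: cruise time plus a fixed takeoff/landing overhead.
    Both constants are assumptions, not the values this page actually used."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m} min"

print(estimated_flight_time(1038))  # ~2 h 35 min under these assumed constants
```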
What is the time difference between Bhairawa and Lijiang?
The time difference between Bhairawa and Lijiang is 2 hours 15 minutes: Lijiang (China Standard Time, UTC+8) is 2 hours 15 minutes ahead of Bhairawa (Nepal Time, UTC+5:45).
Flight carbon footprint between Gautam Buddha Airport (BWA) and Lijiang Sanyi International Airport (LJG)
On average, flying from Bhairawa to Lijiang generates about 153 kg (337 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
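One way such a per-passenger figure can be estimated, sketched below, is fuel burned per seat-mile multiplied by the standard factor of 3.16 kg of CO2 per kg of jet fuel. The per-seat burn rate here is back-solved to match the quoted 153 kg and is not a figure from the page.

```python
def co2_per_passenger_kg(distance_miles, fuel_kg_per_seat_mile=0.0466):
    """Per-passenger CO2 from jet-fuel burn alone (no radiative-forcing uplift)."""
    CO2_PER_KG_FUEL = 3.16  # standard ICAO emission factor for jet fuel
    # fuel_kg_per_seat_mile is an assumed, aircraft-dependent figure,
    # chosen here so the result matches the page's 153 kg estimate.
    return distance_miles * fuel_kg_per_seat_mile * CO2_PER_KG_FUEL

kg = co2_per_passenger_kg(1038)
print(f"{kg:.0f} kg CO2 ≈ {kg * 2.20462:.0f} lbs")  # ~153 kg ≈ 337 lbs
```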
Map of flight path and driving directions from Bhairawa to Lijiang
See the map of the shortest flight path between Gautam Buddha Airport (BWA) and Lijiang Sanyi International Airport (LJG).
Airport information
| Origin | Gautam Buddha Airport |
|---|---|
| City: | Bhairawa |
| Country: | Nepal |
| IATA Code: | BWA |
| ICAO Code: | VNBW |
| Coordinates: | 27°30′20″N, 83°24′58″E |
| Destination | Lijiang Sanyi International Airport |
|---|---|
| City: | Lijiang |
| Country: | China |
| IATA Code: | LJG |
| ICAO Code: | ZPLJ |
| Coordinates: | 26°40′45″N, 100°14′44″E |
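The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier in this page want decimal degrees. A small converter is sketched below; it assumes the field format matches the tables above exactly.

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like '27°30′20″N' to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(dms_to_decimal("27°30′20″N"), dms_to_decimal("83°24′58″E"))   # BWA ~27.5056, 83.4161
print(dms_to_decimal("26°40′45″N"), dms_to_decimal("100°14′44″E"))  # LJG ~26.6792, 100.2456
```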