How far is Lijiang from Beihai?
The distance between Beihai (Beihai Fucheng Airport) and Lijiang (Lijiang Sanyi International Airport) is 672 miles / 1081 kilometers / 584 nautical miles.
The driving distance from Beihai (BHY) to Lijiang (LJG) is 895 miles / 1440 kilometers, and travel time by car is about 16 hours 10 minutes.
Beihai Fucheng Airport – Lijiang Sanyi International Airport
Distance from Beihai to Lijiang
There are several ways to calculate the distance from Beihai to Lijiang. Here are two standard methods:
Vincenty's formula (applied above)
- 671.850 miles
- 1081.237 kilometers
- 583.821 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
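For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is not the site's exact implementation: the iteration limit and tolerance are illustrative choices, and the coordinates are the airport positions from the tables below converted to decimal degrees.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Geodesic distance on the WGS-84 ellipsoid via Vincenty's inverse formula."""
    a = 6378137.0            # WGS-84 semi-major axis, metres
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis, metres

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):                                 # iterate lambda to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2)
        - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344           # metres -> statute miles

# BHY and LJG coordinates in decimal degrees (from the airport tables below)
print(round(vincenty_miles(21.5392, 109.2939, 26.6792, 100.2456), 2))  # ≈ 671.8 mi
```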
Haversine formula
- 671.769 miles
- 1081.107 kilometers
- 583.751 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
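A minimal Python sketch of the haversine calculation follows, assuming a mean Earth radius of 6,371 km and using decimal-degree coordinates derived from the airport tables below.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Airport coordinates from the tables below, converted to decimal degrees
bhy = (21.5392, 109.2939)   # Beihai Fucheng Airport
ljg = (26.6792, 100.2456)   # Lijiang Sanyi International Airport

km = haversine_km(bhy[0], bhy[1], ljg[0], ljg[1])
print(f"{km:.1f} km = {km / 1.609344:.1f} mi = {km / 1.852:.1f} nmi")
# ≈ 1081 km / 672 mi / 584 nmi, matching the figures above
```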
How long does it take to fly from Beihai to Lijiang?
The estimated flight time from Beihai Fucheng Airport to Lijiang Sanyi International Airport is 1 hour and 46 minutes.
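The exact method behind this estimate isn't stated. A common rule of thumb, sketched below, adds a fixed allowance for taxi, climb and descent to the cruise time at a typical jet speed; both the cruise speed and the allowance are assumptions for illustration, which is why the result comes out slightly higher than the 1 hour 46 minutes quoted above.

```python
# Rough flight-time estimate: cruise at an assumed ~500 mph plus an assumed
# ~30-minute allowance for taxi, climb and descent (not the site's exact method).
distance_mi = 672
cruise_mph = 500          # assumed average cruise speed
overhead_min = 30         # assumed fixed allowance

total_min = distance_mi / cruise_mph * 60 + overhead_min
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # ≈ 1 h 50 min
```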
What is the time difference between Beihai and Lijiang?
There is no time difference between Beihai and Lijiang; both cities observe China Standard Time (UTC+8).
Flight carbon footprint between Beihai Fucheng Airport (BHY) and Lijiang Sanyi International Airport (LJG)
On average, flying from Beihai to Lijiang generates about 121 kg (roughly 267 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
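The page doesn't state its emissions model. A rough per-passenger figure can be sketched as distance multiplied by an emission factor; the ~0.112 kg CO2 per passenger-km used below is an assumed, illustrative value, and real calculators also account for aircraft type, load factor and fuel burn per flight phase.

```python
# Illustrative estimate only: per-passenger CO2 ≈ distance × assumed emission factor.
distance_km = 1081
factor_kg_per_pkm = 0.112          # assumed short/medium-haul emission factor

co2_kg = distance_km * factor_kg_per_pkm
print(f"{co2_kg:.0f} kg CO2 ≈ {co2_kg * 2.20462:.0f} lb")   # ≈ 121 kg ≈ 267 lb
```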
Map of flight path and driving directions from Beihai to Lijiang
See the map of the shortest flight path between Beihai Fucheng Airport (BHY) and Lijiang Sanyi International Airport (LJG).
Airport information
| Origin | Beihai Fucheng Airport |
| --- | --- |
| City: | Beihai |
| Country: | China |
| IATA Code: | BHY |
| ICAO Code: | ZGBH |
| Coordinates: | 21°32′21″N, 109°17′38″E |
| Destination | Lijiang Sanyi International Airport |
| --- | --- |
| City: | Lijiang |
| Country: | China |
| IATA Code: | LJG |
| ICAO Code: | ZPLJ |
| Coordinates: | 26°40′45″N, 100°14′44″E |