How far is Lijiang from Fuyuan?
The distance between Fuyuan (Fuyuan Dongji Airport) and Lijiang (Lijiang Sanyi International Airport) is 2360 miles / 3798 kilometers / 2051 nautical miles.
The driving distance from Fuyuan (FYJ) to Lijiang (LJG) is 2828 miles / 4552 kilometers, and travel time by car is about 51 hours 15 minutes.
Fuyuan Dongji Airport – Lijiang Sanyi International Airport
Distance from Fuyuan to Lijiang
There are several ways to calculate the distance from Fuyuan to Lijiang. Here are two standard methods:
Vincenty's formula (applied above)
- 2359.769 miles
- 3797.681 kilometers
- 2050.584 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
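The ellipsoidal calculation can be reproduced with a short script. Below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid; the function name `vincenty_m` and the rounded coordinates are illustrative rather than part of this page's own calculator, and the result should land close to the roughly 3797.7 km figure quoted above.

```python
import math

# WGS-84 ellipsoid constants
A_AXIS = 6378137.0              # semi-major axis in metres
FLATTENING = 1 / 298.257223563
B_AXIS = (1 - FLATTENING) * A_AXIS

def vincenty_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse distance in metres between two points in decimal degrees.
    Returns None if the iteration fails to converge (possible near antipodes)."""
    f = FLATTENING
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                                     + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    else:
        return None  # did not converge

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma)

# FYJ and LJG coordinates in decimal degrees (from the airport tables below)
metres = vincenty_m(48.1994, 134.3664, 26.6792, 100.2456)
print(f"{metres / 1000:.1f} km, {metres / 1609.344:.1f} mi")
```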
Haversine formula
- 2358.176 miles
- 3795.116 kilometers
- 2049.199 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
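For comparison, here is a minimal haversine sketch in Python. The helper name `haversine_km` and the 6371 km mean Earth radius are assumptions; depending on the radius used, the result should come within a few kilometres of the roughly 3795.1 km figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# FYJ -> LJG, coordinates in decimal degrees (from the airport tables below)
km = haversine_km(48.1994, 134.3664, 26.6792, 100.2456)
print(f"{km:.1f} km, {km / 1.609344:.1f} mi, {km / 1.852:.1f} NM")
```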
How long does it take to fly from Fuyuan to Lijiang?
The estimated flight time from Fuyuan Dongji Airport to Lijiang Sanyi International Airport is 4 hours and 58 minutes.
What is the time difference between Fuyuan and Lijiang?
There is no time difference between Fuyuan and Lijiang: both cities observe China Standard Time (UTC+8).
Flight carbon footprint between Fuyuan Dongji Airport (FYJ) and Lijiang Sanyi International Airport (LJG)
On average, flying from Fuyuan to Lijiang generates about 259 kg of CO2 per passenger, and 259 kilograms equals 571 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
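As a quick check of the unit conversion, a one-line calculation using the standard factor of about 2.20462 lb per kg reproduces the pound figure:

```python
co2_kg = 259                     # per-passenger estimate quoted above
print(round(co2_kg * 2.20462))   # -> 571 lb
```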
Map of flight path and driving directions from Fuyuan to Lijiang
See the map of the shortest flight path between Fuyuan Dongji Airport (FYJ) and Lijiang Sanyi International Airport (LJG).
Airport information
| Origin | Fuyuan Dongji Airport |
| --- | --- |
| City: | Fuyuan |
| Country: | China |
| IATA Code: | FYJ |
| ICAO Code: | ZYFY |
| Coordinates: | 48°11′58″N, 134°21′59″E |
| Destination | Lijiang Sanyi International Airport |
| --- | --- |
| City: | Lijiang |
| Country: | China |
| IATA Code: | LJG |
| ICAO Code: | ZPLJ |
| Coordinates: | 26°40′45″N, 100°14′44″E |