How far is Lijiang from Charlotte, NC?
The distance between Charlotte (Charlotte Douglas International Airport) and Lijiang (Lijiang Sanyi International Airport) is 8173 miles / 13152 kilometers / 7102 nautical miles.
Distance from Charlotte to Lijiang
There are several ways to calculate the distance from Charlotte to Lijiang. Here are two standard methods:
Vincenty's formula (applied above)
- 8172.511 miles
- 13152.381 kilometers
- 7101.718 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
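For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and structure are illustrative rather than from any particular library, the decimal coordinates are converted from the DMS values in the airport tables below, and the special cases (equatorial and near-antipodal points) are skipped.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in statute miles via Vincenty's inverse formula (WGS-84)."""
    a, f = 6378137.0, 1 / 298.257223563       # WGS-84 semi-major axis (m) and flattening
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)
    L = math.radians(lon2 - lon1)

    lam = L
    for _ in range(max_iter):                  # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev, lam = lam, L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dsigma = B * sin_sigma * (cos_2sm + B / 4 * (cos_sigma * (2 * cos_2sm ** 2 - 1)
             - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return b * A * (sigma - dsigma) / 1609.344    # meters -> statute miles

# CLT (35°12′50″N, 80°56′35″W) and LJG (26°40′45″N, 100°14′44″E) in decimal degrees
print(round(vincenty_miles(35.2139, -80.9431, 26.6792, 100.2456), 1))  # ≈ 8172.5
```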
Haversine formula
- 8159.700 miles
- 13131.764 kilometers
- 7090.585 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
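As a companion sketch, here is the haversine formula in Python, assuming a sphere with the conventional mean earth radius of 6371.0 km; the result shifts slightly depending on the radius chosen.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in statute miles on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi, dlam = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h)) / 1.609344  # km -> miles

print(round(haversine_miles(35.2139, -80.9431, 26.6792, 100.2456), 1))  # ≈ 8159.7
```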
How long does it take to fly from Charlotte to Lijiang?
The estimated flight time from Charlotte Douglas International Airport to Lijiang Sanyi International Airport is 15 hours and 58 minutes.
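The page does not publish its timing formula, but one plausible back-of-envelope reconstruction divides the distance by an assumed average block speed. The 512 mph figure below is hypothetical, chosen because it reproduces the quoted time; it is not a parameter from the source.

```python
distance_mi = 8172.511     # Vincenty distance from the section above
avg_speed_mph = 512        # assumed average block speed (illustrative, not from the source)
hours = distance_mi / avg_speed_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # -> 15 h 58 min
```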
What is the time difference between Charlotte and Lijiang?
Lijiang is on China Standard Time (UTC+8) year-round, while Charlotte observes Eastern Time (UTC-5 in winter, UTC-4 in summer), so Lijiang is 13 hours ahead of Charlotte during Eastern Standard Time and 12 hours ahead during Eastern Daylight Time.
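A quick way to verify the offset is Python's standard-library zoneinfo database; Lijiang, like all of mainland China, falls in the Asia/Shanghai zone.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library, Python 3.9+

def hours_ahead(when: datetime) -> float:
    """How many hours Lijiang is ahead of Charlotte at the given instant."""
    charlotte = when.astimezone(ZoneInfo("America/New_York")).utcoffset()
    lijiang = when.astimezone(ZoneInfo("Asia/Shanghai")).utcoffset()
    return (lijiang - charlotte).total_seconds() / 3600

print(hours_ahead(datetime(2024, 1, 15, tzinfo=ZoneInfo("UTC"))))  # 13.0 (Charlotte on EST)
print(hours_ahead(datetime(2024, 7, 15, tzinfo=ZoneInfo("UTC"))))  # 12.0 (Charlotte on EDT)
```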
Flight carbon footprint between Charlotte Douglas International Airport (CLT) and Lijiang Sanyi International Airport (LJG)
On average, flying from Charlotte to Lijiang generates about 1,024 kg of CO2 per passenger, which is roughly 2,257 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
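The pound figure is just a unit conversion of the kilogram estimate; a one-line check follows, and the small discrepancy against 2,257 suggests the page rounds an unrounded underlying kilogram value.

```python
co2_kg = 1024                      # per-passenger estimate quoted above
print(round(co2_kg * 2.20462262))  # 2258; the page's 2,257 likely reflects an unrounded kg figure
```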
Map of flight path from Charlotte to Lijiang
See the map of the shortest flight path between Charlotte Douglas International Airport (CLT) and Lijiang Sanyi International Airport (LJG).
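The shortest flight path drawn on such a map is the great circle between the two airports. Below is a minimal sketch, assuming a spherical earth, that generates intermediate waypoints for plotting by spherical linear interpolation; the function names are illustrative.

```python
import math

def to_unit_vector(lat, lon):
    """Latitude/longitude in degrees to a 3-D unit vector."""
    phi, lam = math.radians(lat), math.radians(lon)
    return (math.cos(phi) * math.cos(lam), math.cos(phi) * math.sin(lam), math.sin(phi))

def great_circle_waypoints(lat1, lon1, lat2, lon2, n=10):
    """Return n+1 (lat, lon) points along the great circle between two locations."""
    a, b = to_unit_vector(lat1, lon1), to_unit_vector(lat2, lon2)
    d = math.acos(max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b)))))  # central angle
    points = []
    for i in range(n + 1):
        f = i / n
        w1, w2 = math.sin((1 - f) * d) / math.sin(d), math.sin(f * d) / math.sin(d)
        x, y, z = (w1 * p + w2 * q for p, q in zip(a, b))
        points.append((math.degrees(math.asin(z)), math.degrees(math.atan2(y, x))))
    return points

# Five segments between CLT and LJG; the path arcs far north of the direct parallel
for lat, lon in great_circle_waypoints(35.2139, -80.9431, 26.6792, 100.2456, n=5):
    print(f"{lat:7.2f}, {lon:8.2f}")
```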
Airport information
| Origin | Charlotte Douglas International Airport |
| --- | --- |
| City | Charlotte, NC |
| Country | United States |
| IATA Code | CLT |
| ICAO Code | KCLT |
| Coordinates | 35°12′50″N, 80°56′35″W |
| Destination | Lijiang Sanyi International Airport |
| --- | --- |
| City | Lijiang |
| Country | China |
| IATA Code | LJG |
| ICAO Code | ZPLJ |
| Coordinates | 26°40′45″N, 100°14′44″E |
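The coordinates above are given in degrees, minutes, and seconds; a small helper with a hypothetical name converts them to the signed decimal degrees used in the code sketches earlier.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees (S and W are negative)."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(35, 12, 50, "N"), dms_to_decimal(80, 56, 35, "W"))   # CLT: 35.2139, -80.9431
print(dms_to_decimal(26, 40, 45, "N"), dms_to_decimal(100, 14, 44, "E"))  # LJG: 26.6792, 100.2456
```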