How far is Lijiang from Phoenix, AZ?

The distance between Phoenix (Phoenix Sky Harbor International Airport) and Lijiang (Lijiang Sanyi International Airport) is 7788 miles / 12534 kilometers / 6768 nautical miles.

Phoenix Sky Harbor International Airport – Lijiang Sanyi International Airport

7788 miles / 12534 kilometers / 6768 nautical miles

Distance from Phoenix to Lijiang

There are several ways to calculate the distance from Phoenix to Lijiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 7788.110 miles
  • 12533.748 kilometers
  • 6767.682 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
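
For reference, here is a minimal Python sketch of Vincenty's inverse method, assuming WGS-84 ellipsoid parameters (the calculator's exact settings are not published):

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns miles."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # meters -> statute miles

print(round(vincenty_miles(33.4342, -112.0119, 26.6792, 100.2456), 1))  # ≈ 7788 miles
```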

Haversine formula
  • 7775.195 miles
  • 12512.963 kilometers
  • 6756.459 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the earth's surface).
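
A compact Python sketch of the haversine calculation, assuming the mean Earth radius of 3958.8 miles (the radius the calculator actually uses is an assumption here):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius=3958.8):
    """Great-circle distance on a sphere; radius defaults to the mean
    Earth radius in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius * math.asin(math.sqrt(a))

# PHX and LJG in decimal degrees (converted from the DMS coordinates below)
print(round(haversine_miles(33.4342, -112.0119, 26.6792, 100.2456), 1))  # ≈ 7775 miles
```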

How long does it take to fly from Phoenix to Lijiang?

The estimated flight time from Phoenix Sky Harbor International Airport to Lijiang Sanyi International Airport is 15 hours and 14 minutes.
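
The page does not say how this figure is derived. A common rule of thumb, sketched below, is distance divided by an assumed average cruising speed plus a fixed takeoff/landing allowance; the 500 mph and 30-minute values are illustrative assumptions, not the calculator's published parameters (its 15 h 14 min implies a somewhat faster assumed speed):

```python
distance_miles = 7788.110   # Vincenty distance from above
cruise_mph = 500            # assumed average cruising speed
overhead_hours = 0.5        # assumed allowance for takeoff and landing

hours = distance_miles / cruise_mph + overhead_hours
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")     # ≈ 16 h 5 min with these assumed parameters
```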

Flight carbon footprint between Phoenix Sky Harbor International Airport (PHX) and Lijiang Sanyi International Airport (LJG)

On average, flying from Phoenix to Lijiang generates about 968 kg of CO2 per passenger, equivalent to 2,133 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
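
The pound figure is a straightforward unit conversion; a quick check (the one-pound difference from the 2,133 lbs above most likely comes from converting an unrounded kilogram value):

```python
co2_kg = 968                      # estimated CO2 per passenger for this route
KG_PER_LB = 0.45359237            # exact kilograms per pound
print(round(co2_kg / KG_PER_LB))  # 2134 lbs
```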

Map of flight path from Phoenix to Lijiang

See the map of the shortest flight path between Phoenix Sky Harbor International Airport (PHX) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin: Phoenix Sky Harbor International Airport
City: Phoenix, AZ
Country: United States
IATA Code: PHX
ICAO Code: KPHX
Coordinates: 33°26′3″N, 112°0′43″W
Destination: Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E
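
The distance formulas above take decimal degrees; a small helper (illustrative, not part of the calculator) converts the DMS coordinates listed here:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

phx = (dms_to_decimal(33, 26, 3, "N"), dms_to_decimal(112, 0, 43, "W"))
ljg = (dms_to_decimal(26, 40, 45, "N"), dms_to_decimal(100, 14, 44, "E"))
print(phx)  # approximately (33.4342, -112.0119)
print(ljg)  # approximately (26.6792, 100.2456)
```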