
How far is Lijiang from Seattle, WA?

The distance between Seattle (Seattle–Tacoma International Airport) and Lijiang (Lijiang Sanyi International Airport) is 6685 miles / 10759 kilometers / 5809 nautical miles.

Seattle–Tacoma International Airport – Lijiang Sanyi International Airport

6685 miles / 10759 kilometers / 5809 nautical miles


Distance from Seattle to Lijiang

There are several ways to calculate the distance from Seattle to Lijiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 6685.207 miles
  • 10758.798 kilometers
  • 5809.286 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
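As a rough cross-check, the same route can be computed in Python with the third-party geopy package (an assumption; the site does not say which implementation it uses). geopy's geodesic distance runs Karney's algorithm on the WGS-84 ellipsoid, which for a route like this agrees with Vincenty's formula to well under a metre.

```python
# A minimal sketch, assuming the geopy package (pip install geopy); the site
# does not state which implementation it uses. geodesic() applies Karney's
# algorithm on the WGS-84 ellipsoid, which matches Vincenty's result closely.
from geopy.distance import geodesic

sea = (47.448889, -122.308889)   # SEA: 47°26′56″N, 122°18′32″W
ljg = (26.679167, 100.245556)    # LJG: 26°40′45″N, 100°14′44″E

d = geodesic(sea, ljg)
print(f"{d.miles:.0f} mi / {d.kilometers:.0f} km / {d.nautical:.0f} NM")
# Expected to land very close to the 6685 mi / 10759 km / 5809 NM shown above.
```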

Haversine formula
  • 6672.066 miles
  • 10737.650 kilometers
  • 5797.867 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
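The haversine calculation is simple enough to sketch directly. The snippet below assumes a mean Earth radius of 6371 km; the site does not state the radius it uses, so the result may differ from the figures above by a few kilometres.

```python
# Pure-Python haversine (great-circle) distance; the 6371 km mean Earth radius
# is an assumption, so expect small differences from the figures above.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometres."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r_km * asin(sqrt(a))

km = haversine_km(47.448889, -122.308889, 26.679167, 100.245556)
print(f"{km:.0f} km / {km / 1.609344:.0f} mi / {km / 1.852:.0f} NM")
```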

How long does it take to fly from Seattle to Lijiang?

The estimated flight time from Seattle–Tacoma International Airport to Lijiang Sanyi International Airport is 13 hours and 9 minutes.
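The site does not publish its flight-time model, but a common back-of-the-envelope estimate divides the great-circle distance by an assumed average ground speed. The speeds below are assumptions for illustration, not the site's figures.

```python
# Rough sketch only: flight time ≈ distance / average ground speed.
# The cruise speeds tried here are assumptions; the site's model is unknown.
distance_mi = 6685
for speed_mph in (480, 500, 520):
    hours = distance_mi / speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    print(f"at {speed_mph} mph: about {h} h {m:02d} min")
```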

Flight carbon footprint between Seattle–Tacoma International Airport (SEA) and Lijiang Sanyi International Airport (LJG)

On average, flying from Seattle to Lijiang generates about 811 kg of CO2 per passenger, which is equivalent to about 1,787 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
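As a quick unit check, converting the same figure with 1 kg ≈ 2.20462 lb:

```python
# Simple kg-to-lb conversion check for the figure above.
co2_kg = 811
co2_lb = co2_kg * 2.20462
print(f"{co2_kg} kg ≈ {co2_lb:,.0f} lb")   # small differences from the quoted value are rounding effects
```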

Map of flight path from Seattle to Lijiang

See the map of the shortest flight path between Seattle–Tacoma International Airport (SEA) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin: Seattle–Tacoma International Airport
City: Seattle, WA
Country: United States
IATA Code: SEA
ICAO Code: KSEA
Coordinates: 47°26′56″N, 122°18′32″W
Destination: Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E