
How far is Lijiang from Toronto?

The distance between Toronto (Toronto Pearson International Airport) and Lijiang (Lijiang Sanyi International Airport) is 7589 miles / 12214 kilometers / 6595 nautical miles.

Toronto Pearson International Airport – Lijiang Sanyi International Airport: 7589 miles / 12214 kilometers / 6595 nautical miles

Distance from Toronto to Lijiang

There are several ways to calculate the distance from Toronto to Lijiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 7589.355 miles
  • 12213.883 kilometers
  • 6594.969 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
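As a rough illustration (not the site's own code), here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the YYZ and LJG coordinates from the airport information below, converted to decimal degrees. The result should land very close to the 7589.355-mile figure quoted above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in meters via Vincenty's inverse formula on the WGS-84 ellipsoid."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        if cos_sq_alpha != 0:
            cos2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
        else:
            cos2sigma_m = 0.0   # both points on the equator
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos2sigma_m ** 2) -
            B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YYZ (43°40′37″N, 79°37′50″W) to LJG (26°40′45″N, 100°14′44″E) in decimal degrees
meters = vincenty_distance(43.6769, -79.6306, 26.6792, 100.2456)
print(f"{meters / 1609.344:.3f} miles")   # expected to be close to 7589 miles
```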

Haversine formula
  • 7575.647 miles
  • 12191.822 kilometers
  • 6583.057 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
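A corresponding sketch of the haversine formula, assuming a mean Earth radius of 6371 km; with that radius the result comes out near the 7575.647-mile figure above, slightly shorter than the ellipsoidal value because a sphere is only an approximation of the Earth's shape.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance assuming a spherical Earth (mean radius ~6371 km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(43.6769, -79.6306, 26.6792, 100.2456)
print(f"{km:.1f} km / {km / 1.609344:.1f} miles")   # roughly 12192 km / 7576 miles
```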

How long does it take to fly from Toronto to Lijiang?

The estimated flight time from Toronto Pearson International Airport to Lijiang Sanyi International Airport is 14 hours and 52 minutes.
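The page does not state how this estimate is derived. A common rule of thumb is a fixed allowance for taxi, climb and descent plus time at a typical jet cruise speed; with an assumed 30-minute allowance and an 850 km/h cruise speed, the figure above is reproduced, but those constants are assumptions rather than the calculator's published method.

```python
def estimated_flight_time(distance_km, cruise_kmh=850, overhead_min=30):
    """Rough block-time estimate: fixed taxi/climb/descent allowance plus cruise time.
    The 850 km/h and 30 min constants are assumptions, not the site's documented model."""
    total_min = overhead_min + distance_km / cruise_kmh * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimated_flight_time(12213.883))   # -> "14 hours and 52 minutes"
```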

Flight carbon footprint between Toronto Pearson International Airport (YYZ) and Lijiang Sanyi International Airport (LJG)

On average, flying from Toronto to Lijiang generates about 939 kg of CO2 per passenger; 939 kilograms equals 2,070 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
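The unit conversion in that sentence is easy to verify; the 939 kg estimate itself comes from the calculator's emissions model, which is not detailed here.

```python
LBS_PER_KG = 2.2046226   # pounds per kilogram

print(round(939 * LBS_PER_KG))   # -> 2070
```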

Map of flight path from Toronto to Lijiang

See the map of the shortest flight path between Toronto Pearson International Airport (YYZ) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin: Toronto Pearson International Airport
City: Toronto
Country: Canada
IATA Code: YYZ
ICAO Code: CYYZ
Coordinates: 43°40′37″N, 79°37′50″W

Destination: Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E
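The coordinates above are listed in degrees, minutes and seconds; a small helper (hypothetical, for illustration) converts them to the decimal degrees used in the distance sketches earlier.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(43, 40, 37, "N"), dms_to_decimal(79, 37, 50, "W"))   # YYZ ≈ 43.6769, -79.6306
print(dms_to_decimal(26, 40, 45, "N"), dms_to_decimal(100, 14, 44, "E"))  # LJG ≈ 26.6792, 100.2456
```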