
How far is Lijiang from Nan?

The distance between Nan (Nan Nakhon Airport) and Lijiang (Lijiang Sanyi International Airport) is 543 miles / 873 kilometers / 472 nautical miles.

The driving distance from Nan (NNT) to Lijiang (LJG) is 802 miles / 1290 kilometers, and travel time by car is about 17 hours 14 minutes.


Distance from Nan to Lijiang

There are several ways to calculate the distance from Nan to Lijiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 542.717 miles
  • 873.418 kilometers
  • 471.608 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
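
For illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values under "Airport information" below; the iteration cap and convergence tolerance are arbitrary implementation choices, not part of the formula itself.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in kilometres (Vincenty inverse, WGS-84)."""
    a = 6378137.0             # semi-major axis in metres
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):      # iterate lambda to convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0        # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sinAlpha * (sigma + C * sinSigma * (
              cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
             cosSigma * (-1 + 2 * cos2SigmaM ** 2)
             - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
             * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma) / 1000.0  # metres -> kilometres

# NNT (18°48′28″N, 100°46′58″E) to LJG (26°40′45″N, 100°14′44″E)
print(vincenty_distance(18.8078, 100.7828, 26.6792, 100.2456))  # ≈ 873.4 km
```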

Haversine formula
  • 544.926 miles
  • 876.974 kilometers
  • 473.528 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
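
For comparison, here is the same coordinate pair through the haversine formula. The result depends slightly on which mean earth radius you pick; this sketch assumes 6,371 km, which lands within a fraction of a kilometre of the figure above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# NNT to LJG, same decimal coordinates as above
print(haversine_distance(18.8078, 100.7828, 26.6792, 100.2456))  # ≈ 877.0 km
```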

How long does it take to fly from Nan to Lijiang?

The estimated flight time from Nan Nakhon Airport to Lijiang Sanyi International Airport is 1 hour and 31 minutes.
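
The site does not publish its exact assumptions, but calculators like this typically divide the distance by a typical cruise speed and add a fixed allowance for taxi, climb, and descent. Here is a sketch with illustrative constants (a 500 mph cruise and a 30-minute allowance; the page's 1 hour 31 minutes implies slightly different values):

```python
def estimated_flight_time(distance_miles, cruise_mph=500, allowance_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent
    allowance. Both constants are illustrative assumptions, not this site's."""
    total_min = distance_miles / cruise_mph * 60 + allowance_min
    return divmod(round(total_min), 60)

hours, minutes = estimated_flight_time(543)
print(f"{hours} h {minutes} min")  # 1 h 35 min with these assumptions
```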

Flight carbon footprint between Nan Nakhon Airport (NNT) and Lijiang Sanyi International Airport (LJG)

On average, flying from Nan to Lijiang generates about 105 kg (231 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
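
The page does not state how the 105 kg figure is derived. Back-solving from the numbers above gives an implied factor of roughly 0.193 kg of CO2 per passenger-mile, which this hypothetical sketch applies:

```python
KG_PER_LB = 0.45359237  # exact kilogram-to-pound conversion factor

def co2_per_passenger_kg(distance_miles, factor_kg_per_mile=0.193):
    """Fuel-burn CO2 estimate. The per-mile factor is back-solved from this
    page's own figures (105 kg over 543 miles), not a published constant."""
    return distance_miles * factor_kg_per_mile

kg = co2_per_passenger_kg(543)
print(f"{kg:.0f} kg = {kg / KG_PER_LB:.0f} lb")  # 105 kg = 231 lb
```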

Map of flight path and driving directions from Nan to Lijiang

See the map of the shortest flight path between Nan Nakhon Airport (NNT) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin: Nan Nakhon Airport
City: Nan
Country: Thailand
IATA Code: NNT
ICAO Code: VTCN
Coordinates: 18°48′28″N, 100°46′58″E
Destination: Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E
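
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion helper (the function name is ours, for illustration):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert D°M′S″ plus a hemisphere letter (N/S/E/W) to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Nan Nakhon Airport: 18°48′28″N, 100°46′58″E
print(dms_to_decimal(18, 48, 28, "N"))   # ≈ 18.8078
print(dms_to_decimal(100, 46, 58, "E"))  # ≈ 100.7828
```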