How far is Lijiang from Qui Nhon?

The distance between Qui Nhon (Phu Cat Airport) and Lijiang (Lijiang Sanyi International Airport) is 1044 miles / 1680 kilometers / 907 nautical miles.

The driving distance from Qui Nhon (UIH) to Lijiang (LJG) is 1346 miles / 2166 kilometers, and travel time by car is about 25 hours 37 minutes.

Phu Cat Airport – Lijiang Sanyi International Airport

1044 miles / 1680 kilometers / 907 nautical miles

Distance from Qui Nhon to Lijiang

There are several ways to calculate the distance from Qui Nhon to Lijiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1044.065 miles
  • 1680.261 kilometers
  • 907.268 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
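
As a sketch of how such an ellipsoidal calculation works, here is a compact Python implementation of Vincenty's inverse method on the WGS-84 ellipsoid. This is the standard published algorithm, not necessarily the exact code behind the figure above; the decimal coordinates come from the DMS values listed under "Airport information" below.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty's inverse method on the WGS-84 ellipsoid; returns kilometers.

    Assumes distinct, non-equatorial points; the iteration may fail to
    converge for nearly antipodal pairs.
    """
    a, f = 6378137.0, 1 / 298.257223563      # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a                          # semi-minor axis
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):                     # iterate until longitude converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam, cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma) / 1000.0

print(f"{vincenty_km(13.9547, 109.0419, 26.6792, 100.2456):.1f} km")  # ≈ 1680.3 km
```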

Haversine formula
  • 1046.799 miles
  • 1684.660 kilometers
  • 909.644 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the sphere's surface.
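
For comparison, a minimal Python sketch of the haversine formula, using the same decimal coordinates as above. The 6371 km mean earth radius is the usual convention, and the mile/nautical-mile conversion factors are standard.

```python
import math

# Airport coordinates in decimal degrees (from the DMS values listed below).
UIH = (13.9547, 109.0419)   # Phu Cat Airport
LJG = (26.6792, 100.2456)   # Lijiang Sanyi International Airport

def haversine_km(lat1, lon1, lat2, lon2, earth_radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * earth_radius_km * math.asin(math.sqrt(a))

d_km = haversine_km(*UIH, *LJG)
print(f"{d_km:.1f} km / {d_km * 0.621371:.1f} mi / {d_km * 0.539957:.1f} nm")
# -> roughly 1684.7 km / 1046.8 mi / 909.6 nm, matching the figures above
```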

How long does it take to fly from Qui Nhon to Lijiang?

The estimated flight time from Phu Cat Airport to Lijiang Sanyi International Airport is 2 hours and 28 minutes.
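
The page does not state the model behind this estimate. A common rule of thumb, sketched below with assumed parameter values, adds a fixed taxi/climb/descent overhead to time spent at a typical jet cruise speed; because the parameters here are assumptions, the result differs slightly from the quoted figure.

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: fixed taxi/climb/descent overhead plus cruise.

    Both parameters are assumptions, not this site's actual model.
    """
    return overhead_min + distance_miles / cruise_mph * 60

total = estimate_flight_minutes(1044)
print(f"{int(total // 60)} hours {round(total % 60)} minutes")
# -> 2 hours 35 minutes with these assumed parameters; the page quotes
#    2 hours 28 minutes, so its speed/overhead model evidently differs slightly.
```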

Flight carbon footprint between Phu Cat Airport (UIH) and Lijiang Sanyi International Airport (LJG)

On average, flying from Qui Nhon to Lijiang generates about 154 kg of CO2 per passenger, which is about 339 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
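
The kilograms-to-pounds conversion behind the quoted figure is just the standard factor of about 2.20462 lb/kg:

```python
LB_PER_KG = 2.20462   # standard conversion factor
co2_kg = 154          # per-passenger estimate quoted above
print(f"{co2_kg} kg ≈ {co2_kg * LB_PER_KG:.1f} lb")  # -> 339.5 lb, rounded to 339 on the page
```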

Map of flight path and driving directions from Qui Nhon to Lijiang

See the map of the shortest flight path between Phu Cat Airport (UIH) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin: Phu Cat Airport
City: Qui Nhon
Country: Vietnam
IATA Code: UIH
ICAO Code: VVPC
Coordinates: 13°57′17″N, 109°2′31″E
Destination: Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E
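
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small hypothetical helper for the conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemi):
    """Degrees/minutes/seconds plus hemisphere letter -> signed decimal degrees."""
    sign = -1 if hemi in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

uih = (dms_to_decimal(13, 57, 17, "N"), dms_to_decimal(109, 2, 31, "E"))
ljg = (dms_to_decimal(26, 40, 45, "N"), dms_to_decimal(100, 14, 44, "E"))
print(uih)  # ≈ (13.9547, 109.0419)
print(ljg)  # ≈ (26.6792, 100.2456)
```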