
How far is Lijiang from Haiphong?

The distance between Haiphong (Cat Bi International Airport) and Lijiang (Lijiang Sanyi International Airport) is 575 miles / 926 kilometers / 500 nautical miles.

The driving distance from Haiphong (HPH) to Lijiang (LJG) is 774 miles / 1246 kilometers, and travel time by car is about 14 hours 16 minutes.

Cat Bi International Airport – Lijiang Sanyi International Airport

575 miles · 926 kilometers · 500 nautical miles


Distance from Haiphong to Lijiang

There are several ways to calculate the distance from Haiphong to Lijiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 575.174 miles
  • 925.654 kilometers
  • 499.813 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
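As an illustration of how an ellipsoidal distance like the one above can be computed, here is a minimal Python sketch of Vincenty's inverse formula. It assumes the WGS-84 ellipsoid (the page does not state which ellipsoid the calculator uses) and takes the airport coordinates from the airport information section below, converted to decimal degrees.

```python
import math

# WGS-84 ellipsoid parameters (assumed; the page does not specify the ellipsoid)
WGS84_A = 6378137.0                 # semi-major axis, meters
WGS84_F = 1 / 298.257223563         # flattening
WGS84_B = (1 - WGS84_F) * WGS84_A   # semi-minor axis, meters

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in statute miles between two lat/lon points (decimal degrees)
    on the WGS-84 ellipsoid, via Vincenty's inverse formula."""
    f, a, b = WGS84_F, WGS84_A, WGS84_B
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:           # converged
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - d_sigma)
    return meters / 1609.344                    # meters -> statute miles

# HPH -> LJG, decimal degrees from the airport table below
print(vincenty_miles(20.8192, 106.7247, 26.6792, 100.2456))  # ≈ 575.2 miles
```

The result should land very close to the 575.174 miles quoted above; tiny differences can come from coordinate rounding.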

Haversine formula
  • 575.814 miles
  • 926.682 kilometers
  • 500.369 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
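The haversine formula is much shorter. The sketch below uses a mean Earth radius of 6371 km, a common but assumed choice; a slightly different radius shifts the result by a fraction of a kilometer.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius (assumed spherical model)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points
    (decimal degrees), assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# HPH -> LJG, decimal degrees from the airport table below
km = haversine_km(20.8192, 106.7247, 26.6792, 100.2456)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} nmi")  # ≈ 926.7 / 575.8 / 500.4
```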

How long does it take to fly from Haiphong to Lijiang?

The estimated flight time from Cat Bi International Airport to Lijiang Sanyi International Airport is 1 hour and 35 minutes.
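The 1 hour 35 minutes figure is the calculator's own estimate. Purely as an illustration of how such an estimate can be built, the sketch below assumes a cruise speed of about 500 mph plus roughly 30 minutes for taxi, climb, and descent; both parameters are assumptions, not the site's actual method.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus a fixed allowance
    for taxi, climb, and descent. Both parameters are assumed values."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(575))  # ≈ 1 h 39 min, in the same range as the figure above
```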

Flight carbon footprint between Cat Bi International Airport (HPH) and Lijiang Sanyi International Airport (LJG)

On average, flying from Haiphong to Lijiang generates about 109 kg of CO2 per passenger; 109 kilograms is equal to about 240 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
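For illustration only, the sketch below converts the quoted figure to pounds and works backwards to the per-mile emission rate it implies; the 0.19 kg/mile value is derived from the numbers on this page, not an official emission coefficient.

```python
CO2_KG = 109            # per-passenger estimate quoted above
DISTANCE_MILES = 575    # flight distance from this page
KG_TO_LBS = 2.20462

print(f"{CO2_KG * KG_TO_LBS:.0f} lbs")                     # ≈ 240 lbs
print(f"{CO2_KG / DISTANCE_MILES:.2f} kg CO2 per mile")    # ≈ 0.19 kg/mile implied
```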

Map of flight path and driving directions from Haiphong to Lijiang

See the map of the shortest flight path between Cat Bi International Airport (HPH) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin: Cat Bi International Airport
City: Haiphong
Country: Vietnam
IATA Code: HPH
ICAO Code: VVCI
Coordinates: 20°49′9″N, 106°43′29″E
Destination: Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E
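The coordinates above are listed in degrees, minutes, and seconds, while the distance formulas earlier on this page take decimal degrees. A small conversion helper (an assumed utility, not part of the site) bridges the two:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Cat Bi International Airport (HPH): 20°49′9″N, 106°43′29″E
print(dms_to_decimal(20, 49, 9, "N"), dms_to_decimal(106, 43, 29, "E"))   # ≈ 20.8192, 106.7247
# Lijiang Sanyi International Airport (LJG): 26°40′45″N, 100°14′44″E
print(dms_to_decimal(26, 40, 45, "N"), dms_to_decimal(100, 14, 44, "E"))  # ≈ 26.6792, 100.2456
```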