How far is Lijiang from Hagåtña?

The distance between Hagåtña (Guam Antonio B. Won Pat International Airport) and Lijiang (Lijiang Sanyi International Airport) is 3017 miles / 4855 kilometers / 2622 nautical miles.

Guam Antonio B. Won Pat International Airport – Lijiang Sanyi International Airport

3017 miles
4855 kilometers
2622 nautical miles

Distance from Hagåtña to Lijiang

There are several ways to calculate the distance from Hagåtña to Lijiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 3016.975 miles
  • 4855.351 kilometers
  • 2621.680 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
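For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The tolerance and iteration cap are illustrative choices, not parameters documented by this page, and the decimal coordinates are converted from the DMS values under Airport information below.

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        # WGS-84 ellipsoid parameters
        a = 6378137.0            # semi-major axis (m)
        f = 1 / 298.257223563    # flattening
        b = (1 - f) * a          # semi-minor axis (m)

        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(phi2))
        L = math.radians(lon2 - lon1)
        lam = L

        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                            if cos2_alpha else 0.0)
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (
            cos_2sigma_m + B / 4 * (
                cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
                - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma)  # distance in metres

    # GUM (13°29′0″N, 144°47′45″E) to LJG (26°40′45″N, 100°14′44″E)
    metres = vincenty_inverse(13.4833, 144.7958, 26.6792, 100.2456)
    print(metres / 1609.344)  # ≈ 3017 miles, consistent with the figure above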

Haversine formula
  • 3014.013 miles
  • 4850.583 kilometers
  • 2619.105 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
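For comparison, a short Python sketch of the haversine computation. The mean Earth radius of 6371 km is an assumed value; the page does not state which radius it uses, so small differences from the figures above are expected.

    import math

    def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # Great-circle distance on a sphere of the given radius.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    # GUM -> LJG, coordinates in decimal degrees
    print(haversine(13.4833, 144.7958, 26.6792, 100.2456))
    # ≈ 4852 km, close to the 4850.58 km above; the gap comes from the
    # assumed radius and from rounding the DMS coordinates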

How long does it take to fly from Hagåtña to Lijiang?

The estimated flight time from Guam Antonio B. Won Pat International Airport to Lijiang Sanyi International Airport is 6 hours and 12 minutes.
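The page does not document the model behind this estimate. A common rule of thumb is a fixed allowance for taxi, climb and descent plus the great-circle distance flown at an average cruise speed. The sketch below uses an assumed 30-minute allowance and 500 mph cruise speed, so it lands only in the same ballpark as the 6 hours 12 minutes quoted above.

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        # Rough block-time model: fixed overhead plus cruise at constant speed.
        total_min = overhead_min + distance_miles / cruise_mph * 60
        h, m = divmod(round(total_min), 60)
        return f"{h} h {m} min"

    print(estimate_flight_time(3017))  # ≈ 6 h 32 min with these assumed parameters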

Flight carbon footprint between Guam Antonio B. Won Pat International Airport (GUM) and Lijiang Sanyi International Airport (LJG)

On average, flying from Hagåtña to Lijiang generates about 336 kg of CO2 per passenger, which is roughly 741 pounds (lbs). These figures are estimates and account only for the CO2 produced by burning jet fuel.
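The kilogram-to-pound conversion is simple arithmetic (1 kg ≈ 2.20462 lb):

    co2_kg = 336
    co2_lb = co2_kg * 2.20462  # ≈ 740.8
    print(round(co2_lb))       # 741, the figure quoted above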

Map of flight path from Hagåtña to Lijiang

See the map of the shortest flight path between Guam Antonio B. Won Pat International Airport (GUM) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin: Guam Antonio B. Won Pat International Airport
City: Hagåtña
Country: Guam
IATA Code: GUM
ICAO Code: PGUM
Coordinates: 13°29′0″N, 144°47′45″E
Destination: Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E
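
The coordinates above are given in degrees, minutes and seconds; the distance formulas earlier expect decimal degrees. Converting is a matter of dividing minutes by 60 and seconds by 3600:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        # N/E are positive; S/W are negative.
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(13, 29, 0, "N"))    # 13.4833  (GUM latitude)
    print(dms_to_decimal(144, 47, 45, "E"))  # 144.7958 (GUM longitude)
    print(dms_to_decimal(26, 40, 45, "N"))   # 26.6792  (LJG latitude)
    print(dms_to_decimal(100, 14, 44, "E"))  # 100.2456 (LJG longitude)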