
How far is Qianjiang from Hagåtña?

The distance between Hagåtña (Guam Antonio B. Won Pat International Airport) and Qianjiang (Qianjiang Wulingshan Airport) is 2550 miles / 4104 kilometers / 2216 nautical miles.


Distance from Hagåtña to Qianjiang

There are several ways to calculate the distance from Hagåtña to Qianjiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 2550.026 miles
  • 4103.870 kilometers
  • 2215.912 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
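
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula, assuming the standard WGS-84 ellipsoid (the page does not say which ellipsoid it uses). The decimal coordinates are converted from the airport coordinates listed below; the function name and iteration tolerance are illustrative.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Vincenty's inverse formula on the WGS-84 ellipsoid (statute miles)."""
        a = 6378137.0          # semi-major axis, metres
        f = 1 / 298.257223563  # flattening
        b = (1 - f) * a        # semi-minor axis, metres

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):  # iterate lambda; may fail to converge near antipodes
            sinLam, cosLam = math.sin(lam), math.cos(lam)
            sinSigma = math.hypot(cosU2 * sinLam,
                                  cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
            if sinSigma == 0:
                return 0.0     # coincident points
            cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
            sigma = math.atan2(sinSigma, cosSigma)
            sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
            cos2Alpha = 1 - sinAlpha ** 2
            cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
            C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
            lamPrev, lam = lam, L + (1 - C) * f * sinAlpha * (
                sigma + C * sinSigma * (
                    cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
            if abs(lam - lamPrev) < tol:
                break

        u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                * (-3 + 4 * cos2SigmaM ** 2)))
        return b * A * (sigma - dSigma) / 1609.344  # metres -> statute miles

    # GUM (13°29′0″N, 144°47′45″E) to JIQ (29°30′47″N, 108°49′51″E), decimal degrees
    print(round(vincenty_miles(13.4833, 144.7958, 29.5131, 108.8308), 3))  # ≈ 2550.0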

Haversine formula
  • 2548.774 miles
  • 4101.855 kilometers
  • 2214.824 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
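
The haversine computation is much shorter. A sketch, assuming the IUGG mean Earth radius of 6371.0088 km (a different radius shifts the result by a mile or two, which accounts for small discrepancies against the figures above):

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0088):
        """Great-circle distance on a sphere, returned in statute miles."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h)) / 1.609344

    print(round(haversine_miles(13.4833, 144.7958, 29.5131, 108.8308), 3))  # ≈ 2548.8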

How long does it take to fly from Hagåtña to Qianjiang?

The estimated flight time from Guam Antonio B. Won Pat International Airport to Qianjiang Wulingshan Airport is 5 hours and 19 minutes.
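
The calculator does not publish its timing model, but one simple assumption that reproduces the figure is an average block speed of roughly 480 mph applied to the Vincenty distance:

    def flight_time(distance_miles, block_speed_mph=480.0):
        """Rough flight-time estimate; the 480 mph block speed is an assumption."""
        hours = distance_miles / block_speed_mph
        h, m = int(hours), round((hours % 1) * 60)
        if m == 60:   # carry the rounded minute
            h, m = h + 1, 0
        return f"{h} hours and {m} minutes"

    print(flight_time(2550.026))  # -> "5 hours and 19 minutes"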

Flight carbon footprint between Guam Antonio B. Won Pat International Airport (GUM) and Qianjiang Wulingshan Airport (JIQ)

On average, flying from Hagåtña to Qianjiang generates about 281 kg (620 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
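
The per-passenger figure works out to roughly 0.11 kg of CO2 per mile flown. A sketch, assuming a simple linear per-mile factor backed out of the numbers above (real footprint calculators typically weight takeoff and climb more heavily, so this factor is route-specific):

    KG_PER_MILE = 281 / 2550.026  # factor implied for this route, ~0.110 kg CO2/mile
    KG_TO_LBS = 2.20462

    def co2_estimate(distance_miles):
        kg = distance_miles * KG_PER_MILE
        return kg, kg * KG_TO_LBS

    kg, lbs = co2_estimate(2550.026)
    print(f"{kg:.0f} kg CO2 (about {lbs:.0f} lbs)")  # ≈ 281 kg / ≈ 620 lbs after rounding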

Map of flight path from Hagåtña to Qianjiang

See the map of the shortest flight path between Guam Antonio B. Won Pat International Airport (GUM) and Qianjiang Wulingshan Airport (JIQ).

Airport information

Origin: Guam Antonio B. Won Pat International Airport
City: Hagåtña
Country: Guam
IATA Code: GUM
ICAO Code: PGUM
Coordinates: 13°29′0″N, 144°47′45″E
Destination: Qianjiang Wulingshan Airport
City: Qianjiang
Country: China
IATA Code: JIQ
ICAO Code: ZUQJ
Coordinates: 29°30′47″N, 108°49′51″E
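
The coordinates above are given in degrees, minutes and seconds; the decimal degrees used in the code sketches come from a conversion like this:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert D°M′S″ plus hemisphere to signed decimal degrees (S and W negative)."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    print(round(dms_to_decimal(13, 29, 0, "N"), 4))    # GUM latitude  -> 13.4833
    print(round(dms_to_decimal(144, 47, 45, "E"), 4))  # GUM longitude -> 144.7958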