How far is Arxan from Hagåtña?
The distance between Hagåtña (Guam Antonio B. Won Pat International Airport) and Arxan (Arxan Yi'ershi Airport) is 2738 miles / 4407 kilometers / 2379 nautical miles.
Guam Antonio B. Won Pat International Airport – Arxan Yi'ershi Airport
Distance from Hagåtña to Arxan
There are several ways to calculate the distance from Hagåtña to Arxan. Here are two standard methods:
Vincenty's formula (applied above)
- 2738.093 miles
- 4406.533 kilometers
- 2379.337 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
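For reference, here is a minimal sketch of an ellipsoidal distance check in Python using the third-party geopy library. Its `geodesic` function computes distances on the WGS-84 ellipsoid with Karney's method, a modern successor to Vincenty's formula, so the result should agree with the figure above to within a few meters. The decimal coordinates are converted from the DMS values in the airport tables below.

```python
from geopy.distance import geodesic  # pip install geopy

# GUM and YIE coordinates, converted from the airport tables below
gum = (13.4833, 144.7958)   # 13°29′0″N, 144°47′45″E
yie = (47.3106, 119.9117)   # 47°18′38″N, 119°54′42″E

d = geodesic(gum, yie)  # WGS-84 ellipsoidal distance
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} nm")  # ≈ 2738 mi
```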
Haversine formula
- 2742.342 miles
- 4413.372 kilometers
- 2383.030 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
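As a concrete illustration, a minimal Python implementation of the haversine formula (using a mean Earth radius of 6371 km, a common convention) reproduces the spherical figure above:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical Earth of the given radius."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    km = 2 * radius_km * asin(sqrt(a))
    return km / 1.609344  # kilometers to statute miles

# GUM and YIE coordinates from the airport tables below
print(haversine_miles(13.4833, 144.7958, 47.3106, 119.9117))  # ≈ 2742 miles
```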
How long does it take to fly from Hagåtña to Arxan?
The estimated flight time from Guam Antonio B. Won Pat International Airport to Arxan Yi'ershi Airport is 5 hours and 41 minutes.
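The exact parameters behind this estimate are not published. As a rough sketch, assuming an average block speed of about 480 mph (an assumed figure covering climb, cruise, and descent, chosen here because it approximately reproduces the estimate above):

```python
def flight_time(distance_miles, block_speed_mph=480):
    """Rough flight-time estimate; block_speed_mph is an assumed average
    block speed, not a parameter published by the source."""
    hours = distance_miles / block_speed_mph
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours and {m} minutes"

print(flight_time(2738.093))  # ≈ 5 hours and 42 minutes
```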
What is the time difference between Hagåtña and Arxan?
The time difference between Hagåtña and Arxan is 2 hours: Arxan is 2 hours behind Hagåtña.
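This can be checked with Python's standard zoneinfo module. Guam is on Chamorro Standard Time (UTC+10) and Arxan follows China Standard Time (UTC+8); neither observes daylight saving time:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

t = datetime(2024, 6, 1, 12, 0)
guam = ZoneInfo("Pacific/Guam")      # Chamorro Standard Time, UTC+10
china = ZoneInfo("Asia/Shanghai")    # Arxan follows China Standard Time, UTC+8

diff = t.replace(tzinfo=guam).utcoffset() - t.replace(tzinfo=china).utcoffset()
print(diff)  # 2:00:00 — Arxan is 2 hours behind Hagåtña
```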
Flight carbon footprint between Guam Antonio B. Won Pat International Airport (GUM) and Arxan Yi'ershi Airport (YIE)
On average, flying from Hagåtña to Arxan generates about 303 kg (668 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
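The source does not publish its emission model. A minimal sketch, assuming a flat per-passenger factor of roughly 0.069 kg of CO2 per kilometer flown (an assumed average chosen here so the result matches the figure above), would look like:

```python
KG_PER_LB = 0.45359237

def co2_estimate_kg(distance_km, kg_per_pax_km=0.069):
    """Per-passenger CO2 from jet-fuel burn; kg_per_pax_km is an assumed
    average emission factor, not a value published by the source."""
    return distance_km * kg_per_pax_km

kg = co2_estimate_kg(4406.533)
print(round(kg), "kg /", round(kg / KG_PER_LB), "lb")  # ≈ 304 kg / 670 lb
```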
Map of flight path from Hagåtña to Arxan
See the map of the shortest flight path between Guam Antonio B. Won Pat International Airport (GUM) and Arxan Yi'ershi Airport (YIE).
Airport information
| Origin | Guam Antonio B. Won Pat International Airport |
| --- | --- |
| City: | Hagåtña |
| Country: | Guam |
| IATA Code: | GUM |
| ICAO Code: | PGUM |
| Coordinates: | 13°29′0″N, 144°47′45″E |
| Destination | Arxan Yi'ershi Airport |
| --- | --- |
| City: | Arxan |
| Country: | China |
| IATA Code: | YIE |
| ICAO Code: | ZBES |
| Coordinates: | 47°18′38″N, 119°54′42″E |