
How far is Uranium City from Hagåtña?

The distance between Hagåtña (Guam Antonio B. Won Pat International Airport) and Uranium City (Uranium City Airport) is 5990 miles / 9639 kilometers / 5205 nautical miles.

Guam Antonio B. Won Pat International Airport – Uranium City Airport

  • 5990 miles
  • 9639 kilometers
  • 5205 nautical miles


Distance from Hagåtña to Uranium City

There are several ways to calculate the distance from Hagåtña to Uranium City. Here are two standard methods:

Vincenty's formula (applied above)
  • 5989.622 miles
  • 9639.363 kilometers
  • 5204.840 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
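
For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates come from the airport information section further down; the exact ellipsoid constants and convergence tolerance used by this calculator are not stated, so small differences in the last decimal places are expected.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        """Vincenty inverse formula on the WGS-84 ellipsoid, returning statute miles."""
        a = 6378137.0              # WGS-84 semi-major axis (metres)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis (metres)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):                      # iterate until lambda converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                        # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos2_sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                            if cos2_alpha else 0.0)
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos2_sigma_m + C * cos_sigma *
                                         (-1 + 2 * cos2_sigma_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos2_sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos2_sigma_m ** 2) -
            B / 6 * cos2_sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos2_sigma_m ** 2)))
        return b * A * (sigma - delta_sigma) / 1609.344   # metres -> statute miles

    # GUM: 13°29′0″N, 144°47′45″E   YBE: 59°33′41″N, 108°28′51″W
    print(round(vincenty_miles(13.4833, 144.7958, 59.5614, -108.4808), 1))
    # roughly 5,989–5,990 miles; tiny differences come from coordinate rounding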

Haversine formula
  • 5983.636 miles
  • 9629.729 kilometers
  • 5199.638 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
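
A corresponding sketch of the haversine calculation is shown below, assuming a mean Earth radius of 6,371 km (about 3,958.8 miles); the radius actually used by this calculator is not stated, so the result may differ slightly from the figure above.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
        """Great-circle distance assuming a spherical Earth of mean radius ~6,371 km."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_miles * math.asin(math.sqrt(a))

    # Same airport coordinates as above (GUM -> YBE)
    print(round(haversine_miles(13.4833, 144.7958, 59.5614, -108.4808), 1))
    # roughly 5,984–5,985 miles, in line with the ~5,983.6-mile figure above;
    # small differences come from coordinate rounding and the assumed radius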

How long does it take to fly from Hagåtña to Uranium City?

The estimated flight time from Guam Antonio B. Won Pat International Airport to Uranium City Airport is 11 hours and 50 minutes.
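
The assumptions behind this estimate are not published on the page; a common rough rule is distance divided by an average block speed of about 500 mph, as in the sketch below (the speed value is an assumption, not the calculator's documented method).

    # Rough flight-time estimate: distance / assumed average block speed.
    # The 500 mph figure is an assumption; the calculator's exact speed and any
    # takeoff/landing allowance are not documented, so this only approximates
    # the 11 h 50 min quoted above.
    distance_miles = 5990
    avg_speed_mph = 500                      # assumed average block speed
    hours = distance_miles / avg_speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    print(f"approx. {h} h {m} min")          # ≈ 11 h 59 min with these assumptions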

Flight carbon footprint between Guam Antonio B. Won Pat International Airport (GUM) and Uranium City Airport (YBE)

On average, flying from Hagåtña to Uranium City generates about 715 kg of CO2 per passenger, which is equivalent to roughly 1,577 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
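
The pound figure follows directly from the kilogram estimate; a quick check of the conversion (the 715 kg input is the page's own estimate):

    # Convert the quoted per-passenger CO2 estimate from kilograms to pounds.
    co2_kg = 715                       # page's estimate for this route
    co2_lbs = co2_kg * 2.20462         # 1 kg ≈ 2.20462 lb
    print(round(co2_lbs))              # ≈ 1,576; the quoted 1,577 likely reflects an unrounded kg value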

Map of flight path from Hagåtña to Uranium City

See the map of the shortest flight path between Guam Antonio B. Won Pat International Airport (GUM) and Uranium City Airport (YBE).

Airport information

Origin Guam Antonio B. Won Pat International Airport
City: Hagåtña
Country: Guam
IATA Code: GUM
ICAO Code: PGUM
Coordinates: 13°29′0″N, 144°47′45″E
Destination Uranium City Airport
City: Uranium City
Country: Canada
IATA Code: YBE
ICAO Code: CYBE
Coordinates: 59°33′41″N, 108°28′51″W