How far is Bangor, ME, from Hagåtña?

The distance between Hagåtña (Guam Antonio B. Won Pat International Airport) and Bangor (Bangor International Airport) is 7902 miles / 12716 kilometers / 6866 nautical miles.

Distance from Hagåtña to Bangor

There are several ways to calculate the distance from Hagåtña to Bangor. Here are two standard methods:

Vincenty's formula (applied above)
  • 7901.594 miles
  • 12716.383 kilometers
  • 6866.298 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
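
For reference, here is a minimal, self-contained sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The calculator does not publish its code, so the function name, tolerance, and iteration cap are illustrative; the coordinates come from the airport information below.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = ((cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha)
                        if cos2_alpha else 0.0)  # equatorial-line fallback
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters -> statute miles

# GUM (13°29′0″N, 144°47′45″E) to BGR (44°48′26″N, 68°49′41″W)
print(round(vincenty_miles(13.4833, 144.7958, 44.8072, -68.8281), 1))
# should land near the 7901.594 mi quoted above
```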

Haversine formula
  • 7891.566 miles
  • 12700.244 kilometers
  • 6857.583 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
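
By comparison, the haversine computation fits in a few lines. This is a generic sketch; the sphere radius is an assumption, since the calculator does not state which mean Earth radius it uses:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a sphere of the given (assumed) mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_miles * math.asin(math.sqrt(h))

print(round(haversine_miles(13.4833, 144.7958, 44.8072, -68.8281), 1))
# ≈ 7891.6 with this radius, close to the haversine figure above
```

The small gap between the two results (about 10 miles here) comes from the haversine formula's spherical approximation versus Vincenty's ellipsoidal model.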

How long does it take to fly from Hagåtña to Bangor?

The estimated flight time from Guam Antonio B. Won Pat International Airport to Bangor International Airport is 15 hours and 27 minutes.
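
The page does not state the speed model behind this estimate. A common rule of thumb adds a fixed allowance for climb and descent to cruise time at a constant speed; the constants below are assumptions for illustration and give a slightly different figure than the 15 hours 27 minutes above, which implies a faster assumed cruise speed:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rule-of-thumb estimate: fixed climb/descent allowance plus cruise time.
    Both constants are assumed; the calculator's own model may differ."""
    hours = overhead_hours + distance_miles / cruise_mph
    h, m = divmod(round(hours * 60), 60)
    return f"{h} h {m:02d} min"

print(estimated_flight_time(7902))  # "16 h 18 min" under these assumptions
```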

Flight carbon footprint between Guam Antonio B. Won Pat International Airport (GUM) and Bangor International Airport (BGR)

On average, flying from Hagåtña to Bangor generates about 984 kg (2,169 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
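
The unit conversion and the per-mile rate implied by these figures are straightforward to check:

```python
co2_kg = 984                        # per-passenger estimate from above
print(round(co2_kg * 2.20462))      # 2169 lbs
print(round(co2_kg / 7902, 4))      # ≈ 0.1245 kg CO2 per passenger-mile (implied)
```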

Map of flight path from Hagåtña to Bangor

See the map of the shortest flight path between Guam Antonio B. Won Pat International Airport (GUM) and Bangor International Airport (BGR).

Airport information

Origin: Guam Antonio B. Won Pat International Airport
City: Hagåtña
Country: Guam
IATA Code: GUM
ICAO Code: PGUM
Coordinates: 13°29′0″N, 144°47′45″E

Destination: Bangor International Airport
City: Bangor, ME
Country: United States
IATA Code: BGR
ICAO Code: KBGR
Coordinates: 44°48′26″N, 68°49′41″W