
How far is Windsor from Hagåtña?

The distance between Hagåtña (Guam Antonio B. Won Pat International Airport) and Windsor (Windsor International Airport) is 7547 miles / 12145 kilometers / 6558 nautical miles.

Guam Antonio B. Won Pat International Airport – Windsor International Airport

  • 7547 miles
  • 12145 kilometers
  • 6558 nautical miles
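
The three figures are the same distance expressed in different units. As a quick consistency check in Python (both conversion factors are exact by definition):

    miles = 7546.756              # Vincenty distance from the section below
    km = miles * 1.609344         # 1 statute mile = 1.609344 km exactly
    nmi = km / 1.852              # 1 nautical mile = 1.852 km exactly
    print(f"{miles:.0f} mi = {km:.0f} km = {nmi:.0f} nmi")
    # 7547 mi = 12145 km = 6558 nmi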


Distance from Hagåtña to Windsor

There are several ways to calculate the distance from Hagåtña to Windsor. Here are two standard methods:

Vincenty's formula (applied above)
  • 7546.756 miles
  • 12145.327 kilometers
  • 6557.952 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
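
As a concrete illustration, here is a minimal sketch of Vincenty's inverse method in Python, assuming the WGS-84 ellipsoid. The tolerance and iteration cap are choices for this example, not parameters published by the calculator:

    import math

    def vincenty_inverse_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty, 1975).

        Note: the iteration can fail to converge for nearly antipodal points.
        """
        a = 6378137.0                 # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563         # WGS-84 flattening
        b = (1 - f) * a               # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0            # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos(2*sigma_m); zero for equatorial geodesics (cos2_alpha == 0)
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # GUM → YQG, decimal degrees from the airport information section below
    print(vincenty_inverse_m(13.48333, 144.79583, 42.27556, -82.95556) / 1000)
    # ≈ 12145 km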

Haversine formula
  • 7536.846 miles
  • 12129.377 kilometers
  • 6549.340 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
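
A short haversine implementation in Python for comparison; the 6371 km mean Earth radius is the usual spherical assumption:

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance (km) on a sphere of the given mean radius."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    print(haversine_km(13.48333, 144.79583, 42.27556, -82.95556))  # ≈ 12129 km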

How long does it take to fly from Hagåtña to Windsor?

The estimated flight time from Guam Antonio B. Won Pat International Airport to Windsor International Airport is 14 hours and 47 minutes.
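
The calculator's timing model isn't published. A common rule of thumb adds a fixed taxi/climb/descent overhead to cruise time at a typical jet speed; the sketch below assumes a 500 mph cruise and a 30-minute overhead, which is why it lands near, but not exactly on, the figure above:

    def estimate_block_time(distance_miles, cruise_mph=500.0, overhead_h=0.5):
        # Assumed parameters for illustration only; not the site's formula.
        hours = overhead_h + distance_miles / cruise_mph
        h, m = int(hours), round((hours - int(hours)) * 60)
        return f"{h} h {m:02d} min"

    print(estimate_block_time(7547))  # ≈ 15 h 36 min with these assumptions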

Flight carbon footprint between Guam Antonio B. Won Pat International Airport (GUM) and Windsor International Airport (YQG)

On average, flying from Hagåtña to Windsor generates about 933 kg (2056 lb) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
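
A per-passenger figure like this is typically the distance multiplied by an emission intensity. In the sketch below, the intensity (~77 g CO2 per passenger-km) is simply back-calculated from the numbers above, so it is an assumption for illustration, not a published factor:

    KG_PER_LB = 0.45359237   # exact definition of the pound

    def estimate_co2_kg(distance_km, kg_co2_per_pax_km=0.0768):
        # Intensity back-calculated from 933 kg / 12145 km; fuel burn only.
        return distance_km * kg_co2_per_pax_km

    kg = estimate_co2_kg(12145)
    print(f"{kg:.0f} kg CO2 ≈ {kg / KG_PER_LB:.0f} lb")  # ≈ 933 kg ≈ 2056 lb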

Map of flight path from Hagåtña to Windsor

See the map of the shortest flight path between Guam Antonio B. Won Pat International Airport (GUM) and Windsor International Airport (YQG).

Airport information

Origin: Guam Antonio B. Won Pat International Airport
City: Hagåtña
Country: Guam
IATA Code: GUM
ICAO Code: PGUM
Coordinates: 13°29′0″N, 144°47′45″E

Destination: Windsor International Airport
City: Windsor
Country: Canada
IATA Code: YQG
ICAO Code: CYQG
Coordinates: 42°16′32″N, 82°57′20″W