
How far is Whyalla from Hagåtña?

The distance between Hagåtña (Guam Antonio B. Won Pat International Airport) and Whyalla (Whyalla Airport) is 3236 miles / 5208 kilometers / 2812 nautical miles.

Guam Antonio B. Won Pat International Airport – Whyalla Airport

  • 3236 miles
  • 5208 kilometers
  • 2812 nautical miles
  • Flight time: 6 h 37 min
  • CO2 emission: 363 kg


Distance from Hagåtña to Whyalla

There are several ways to calculate the distance from Hagåtña to Whyalla. Here are two standard methods:

Vincenty's formula (applied above)
  • 3236.321 miles
  • 5208.353 kilometers
  • 2812.286 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
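Vincenty's inverse method can be sketched as the standard iteration on the WGS-84 ellipsoid. This is a minimal sketch using the airport coordinates listed further down the page; the calculator's exact ellipsoid parameters and coordinate precision are assumptions, so results may differ slightly from the figures above.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                # semi-major axis (m)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis (m)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L
    for _ in range(200):  # iterate until the longitude on the auxiliary sphere converges
        sin_s = math.sqrt((math.cos(U2) * math.sin(lam)) ** 2 +
                          (math.cos(U1) * math.sin(U2) -
                           math.sin(U1) * math.cos(U2) * math.cos(lam)) ** 2)
        cos_s = math.sin(U1) * math.sin(U2) + math.cos(U1) * math.cos(U2) * math.cos(lam)
        sigma = math.atan2(sin_s, cos_s)
        sin_alpha = math.cos(U1) * math.cos(U2) * math.sin(lam) / sin_s
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_s - 2 * math.sin(U1) * math.sin(U2) / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_s * (cos_2sm + C * cos_s * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_s * (cos_2sm + B / 4 * (cos_s * (-1 + 2 * cos_2sm ** 2) -
              B / 6 * cos_2sm * (-3 + 4 * sin_s ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# GUM → WYA, using the coordinates listed in the airport information section
s_km = vincenty_distance_m(13.483333, 144.795833, -33.058889, 137.513889) / 1000
```

With these inputs the result lands within a fraction of a kilometre of the 5208.353 km shown above.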

Haversine formula
  • 3251.465 miles
  • 5232.726 kilometers
  • 2825.446 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
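The haversine calculation is compact enough to sketch in full. This assumes a mean Earth radius of 6371 km (a common choice, not necessarily the calculator's) and the airport coordinates listed further down the page:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# GUM → WYA, using the coordinates listed in the airport information section
d = haversine_km(13.483333, 144.795833, -33.058889, 137.513889)
mi = d / 1.609344  # kilometres to statute miles
```

This reproduces the roughly 5233 km / 3251 mi figures above; the small gap to the Vincenty result reflects the spherical-Earth assumption.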

How long does it take to fly from Hagåtña to Whyalla?

The estimated flight time from Guam Antonio B. Won Pat International Airport to Whyalla Airport is 6 hours and 37 minutes.
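Flight-time estimates of this kind are typically cruise time plus a fixed taxi/climb buffer. The ~850 km/h cruise speed and 30-minute buffer below are assumptions for illustration, not the site's published method, though they land close to the figure above:

```python
def flight_time(distance_km, cruise_kmh=850, buffer_min=30):
    """Rough estimate: fixed taxi/climb buffer plus cruise time at a constant speed."""
    total_min = buffer_min + distance_km / cruise_kmh * 60
    return int(total_min // 60), int(total_min % 60)

h, m = flight_time(5208.353)  # → (6, 37)
```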

Flight carbon footprint between Guam Antonio B. Won Pat International Airport (GUM) and Whyalla Airport (WYA)

On average, flying from Hagåtña to Whyalla generates about 363 kg of CO2 per passenger, which is roughly 800 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
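The unit conversion and the per-mile rate implied by the figures above can be checked directly (the per-mile rate is derived here for illustration, not published by the site):

```python
KG_PER_LB = 0.45359237          # exact avoirdupois pound in kilograms

co2_kg = 363
co2_lb = co2_kg / KG_PER_LB     # ≈ 800 lb
rate = co2_kg / 3236.321        # implied ≈ 0.112 kg CO2 per passenger-mile
```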

Map of flight path from Hagåtña to Whyalla

See the map of the shortest flight path between Guam Antonio B. Won Pat International Airport (GUM) and Whyalla Airport (WYA).

Airport information

Origin: Guam Antonio B. Won Pat International Airport
City: Hagåtña
Country: Guam
IATA Code: GUM
ICAO Code: PGUM
Coordinates: 13°29′0″N, 144°47′45″E

Destination: Whyalla Airport
City: Whyalla
Country: Australia
IATA Code: WYA
ICAO Code: YWHA
Coordinates: 33°3′32″S, 137°30′50″E
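The coordinates above are in degrees-minutes-seconds, while the distance formulas expect decimal degrees. A small hypothetical helper for the conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds + hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

gum = (dms_to_decimal(13, 29, 0, "N"), dms_to_decimal(144, 47, 45, "E"))   # ≈ (13.4833, 144.7958)
wya = (dms_to_decimal(33, 3, 32, "S"), dms_to_decimal(137, 30, 50, "E"))   # ≈ (-33.0589, 137.5139)
```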