How far is Hervey Bay from Salt Lake City, UT?

The distance between Salt Lake City (Salt Lake City International Airport) and Hervey Bay (Hervey Bay Airport) is 7590 miles / 12215 kilometers / 6595 nautical miles.

Salt Lake City International Airport – Hervey Bay Airport

7590 miles / 12215 kilometers / 6595 nautical miles

Distance from Salt Lake City to Hervey Bay

There are several ways to calculate the distance from Salt Lake City to Hervey Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 7589.818 miles
  • 12214.628 kilometers
  • 6595.372 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
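
For reference, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The iteration tolerance and the decimal conversion of the airport coordinates are assumptions, so the result may differ slightly from the figure above.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Ellipsoidal (WGS-84) distance in metres via Vincenty's inverse formula."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    # Iterate lambda until convergence (may fail for nearly antipodal points)
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # distance in metres

# SLC (40°47'18"N, 111°58'40"W) and HVB (25°19'8"S, 152°52'48"E) in decimal degrees
print(vincenty_inverse(40.7883, -111.9778, -25.3189, 152.8800) / 1000)
# roughly 12,215 km; small differences come from coordinate rounding
```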

Haversine formula
  • 7594.769 miles
  • 12222.596 kilometers
  • 6599.674 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points over the earth's surface).
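
As a concrete example, here is a minimal Python sketch of the haversine formula. The mean Earth radius of 6,371 km and the decimal coordinates are assumptions, so the result differs slightly from the figure above.

```python
import math

EARTH_RADIUS_KM = 6371.0  # assumed mean Earth radius; the exact value used above is not stated

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# SLC (40°47'18"N, 111°58'40"W) to HVB (25°19'8"S, 152°52'48"E) in decimal degrees
print(round(haversine_km(40.7883, -111.9778, -25.3189, 152.8800)))  # roughly 12,220 km
```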

How long does it take to fly from Salt Lake City to Hervey Bay?

The estimated flight time from Salt Lake City International Airport to Hervey Bay Airport is 14 hours and 52 minutes.
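
The cruise speed and ground-time allowance behind this estimate are not published here; the sketch below shows how such an estimate is typically built, using an assumed 500 mph average speed and a 30-minute overhead, so it will not reproduce the figure above exactly.

```python
def estimate_flight_time(distance_miles, avg_speed_mph=500, overhead_minutes=30):
    """Rough block-time estimate: distance at a fixed average speed plus a fixed
    taxi/climb overhead. Both parameters are assumptions, not the site's values."""
    total_minutes = distance_miles / avg_speed_mph * 60 + overhead_minutes
    return divmod(round(total_minutes), 60)  # (hours, minutes)

print(estimate_flight_time(7590))
# (15, 41) with these assumed parameters; 14 h 52 min implies a higher effective speed
```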

Flight carbon footprint between Salt Lake City International Airport (SLC) and Hervey Bay Airport (HVB)

On average, flying from Salt Lake City to Hervey Bay generates about 939 kg of CO2 per passenger, which is about 2,070 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
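
The pound figure is simply a unit conversion of the kilogram estimate, as this short sketch shows.

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 939
print(round(co2_kg * KG_TO_LB))  # about 2070 lb
```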

Map of flight path from Salt Lake City to Hervey Bay

See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Hervey Bay Airport (HVB).

Airport information

Origin: Salt Lake City International Airport
City: Salt Lake City, UT
Country: United States
IATA Code: SLC
ICAO Code: KSLC
Coordinates: 40°47′18″N, 111°58′40″W
Destination: Hervey Bay Airport
City: Hervey Bay
Country: Australia
IATA Code: HVB
ICAO Code: YHBA
Coordinates: 25°19′8″S, 152°52′48″E