
How far is Gold Coast from Salt Lake City, UT?

The distance between Salt Lake City (Salt Lake City International Airport) and Gold Coast (Gold Coast Airport) is 7674 miles / 12351 kilometers / 6669 nautical miles.

Salt Lake City International Airport – Gold Coast Airport


Distance from Salt Lake City to Gold Coast

There are several ways to calculate the distance from Salt Lake City to Gold Coast. Here are two standard methods:

Vincenty's formula (applied above)
  • 7674.298 miles
  • 12350.585 kilometers
  • 6668.782 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
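
As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, iteration cap, and convergence tolerance are illustrative choices (the site's exact implementation is not published), and the decimal coordinates are converted from the DMS values in the airport information section below:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    # Iterate on the longitude difference; may converge slowly for
    # nearly antipodal points (SLC-OOL is far from that case).
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)     # equatorial-line special case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# SLC and OOL in decimal degrees (from the airport information below)
meters = vincenty_distance(40.7883, -111.9778, -28.1642, 153.5050)
print(meters / 1609.344)   # about 7674 miles
```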

Haversine formula
  • 7680.161 miles
  • 12360.021 kilometers
  • 6673.878 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
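
For comparison, a minimal Python sketch of the haversine computation. The mean Earth radius of 3958.8 miles (about 6371 km) is an assumed constant; the site does not state which sphere radius it uses:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_miles * math.asin(math.sqrt(h))

print(haversine_miles(40.7883, -111.9778, -28.1642, 153.5050))  # about 7680 miles
```

The slightly larger result compared with Vincenty's formula reflects the spherical approximation; the two methods here differ by roughly 6 miles over this route.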

How long does it take to fly from Salt Lake City to Gold Coast?

The estimated flight time from Salt Lake City International Airport to Gold Coast Airport is 15 hours and 1 minute.
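
The page does not state the assumptions behind this estimate. Dividing the Vincenty distance by an average speed of about 511 mph reproduces the quoted figure; that speed is back-solved from the quoted time, not a value published by the source:

```python
distance_miles = 7674.298   # Vincenty distance from above
avg_speed_mph = 511         # assumed average speed, back-solved from the quoted time
hours = distance_miles / avg_speed_mph
h, m = int(hours), round(hours % 1 * 60)
print(f"{h} hours and {m} minute(s)")   # -> 15 hours and 1 minute(s)
```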

Flight carbon footprint between Salt Lake City International Airport (SLC) and Gold Coast Airport (OOL)

On average, flying from Salt Lake City to Gold Coast generates about 951 kg of CO2 per passenger; 951 kilograms equals 2,097 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
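
The kilograms-to-pounds conversion is straightforward arithmetic; the constant below is the exact definition of the avoirdupois pound:

```python
co2_kg = 951                       # per-passenger estimate from the page
KG_PER_LB = 0.45359237             # exact kilograms per pound
print(round(co2_kg / KG_PER_LB))   # -> 2097 lbs
```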

Map of flight path from Salt Lake City to Gold Coast

See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Gold Coast Airport (OOL).

Airport information

Origin: Salt Lake City International Airport
City: Salt Lake City, UT
Country: United States
IATA Code: SLC
ICAO Code: KSLC
Coordinates: 40°47′18″N, 111°58′40″W
Destination: Gold Coast Airport
City: Gold Coast
Country: Australia
IATA Code: OOL
ICAO Code: YBCG
Coordinates: 28°9′51″S, 153°30′18″E
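
The coordinates above are given in degrees/minutes/seconds. A small helper (hypothetical, not from the source) converts them to the signed decimal degrees used by the distance formulas earlier:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

slc = (dms_to_decimal(40, 47, 18, "N"), dms_to_decimal(111, 58, 40, "W"))
ool = (dms_to_decimal(28, 9, 51, "S"), dms_to_decimal(153, 30, 18, "E"))
print(slc)  # approximately (40.7883, -111.9778)
print(ool)  # approximately (-28.1642, 153.5050)
```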