
How far is Albury from Salt Lake City, UT?

The distance between Salt Lake City (Salt Lake City International Airport) and Albury (Albury Airport) is 8294 miles / 13348 kilometers / 7207 nautical miles.

Salt Lake City International Airport – Albury Airport

  • Distance: 8294 miles / 13348 kilometers / 7207 nautical miles
  • Flight time: 16 h 12 min
  • CO2 emission: 1 042 kg

Distance from Salt Lake City to Albury

There are several ways to calculate the distance from Salt Lake City to Albury. Here are two standard methods:

Vincenty's formula (applied above)
  • 8293.937 miles
  • 13347.798 kilometers
  • 7207.234 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
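
A minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid is shown below. The coordinates come from the airport information at the bottom of this page; the implementation is illustrative, not the calculator's actual source code.

    import math

    def vincenty_distance_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Ellipsoidal distance in km between two points given in decimal degrees."""
        a = 6378137.0            # WGS-84 semi-major axis (meters)
        f = 1 / 298.257223563    # WGS-84 flattening
        b = (1 - f) * a          # semi-minor axis (meters)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):  # iterate until the longitude difference converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - delta_sigma) / 1000.0  # meters -> kilometers

    # SLC (40°47′18″N, 111°58′40″W) to ABX (36°4′4″S, 146°57′28″E)
    print(vincenty_distance_km(40.7883, -111.9778, -36.0678, 146.9578))  # ≈ 13347.8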

Haversine formula
  • 8300.599 miles
  • 13358.520 kilometers
  • 7213.024 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
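
The spherical version is much shorter. The sketch below assumes a mean Earth radius of 6371 km, which reproduces the ~13 358 km figure above; the roughly 10-kilometer gap versus Vincenty comes entirely from ignoring the Earth's flattening.

    import math

    def haversine_distance_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in km between two points given in decimal degrees."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        # a is the square of half the chord length between the points
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    print(haversine_distance_km(40.7883, -111.9778, -36.0678, 146.9578))  # ≈ 13358.5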

How long does it take to fly from Salt Lake City to Albury?

The estimated flight time from Salt Lake City International Airport to Albury Airport is 16 hours and 12 minutes.
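
The page does not publish the formula behind this estimate. One common rule of thumb (distance divided by an average cruise speed of about 850 km/h, plus roughly 30 minutes for taxi, takeoff, and landing) happens to reproduce the figure, as the sketch below shows; both numbers are assumptions, not the site's confirmed parameters.

    def estimated_flight_time(distance_km, cruise_kmh=850.0, overhead_h=0.5):
        """Round a simple distance/speed estimate to whole minutes."""
        hours = distance_km / cruise_kmh + overhead_h
        h, m = divmod(round(hours * 60), 60)
        return f"{h} h {m} min"

    print(estimated_flight_time(13348))  # 16 h 12 min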

Flight carbon footprint between Salt Lake City International Airport (SLC) and Albury Airport (ABX)

On average, flying from Salt Lake City to Albury generates about 1 042 kg of CO2 per passenger, which is equivalent to roughly 2 296 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
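
The pound figure is a straight unit conversion, as sketched below; the implied factor of roughly 0.078 kg of CO2 per passenger-kilometer is back-calculated from the page's own numbers for illustration and is not the calculator's published model.

    KG_PER_LB = 0.45359237  # exact definition of the international pound

    co2_kg = 1042
    print(f"{co2_kg / KG_PER_LB:.0f} lbs")        # 2297 (the page shows 2 296, so its
                                                  # unrounded kg value is a bit lower)
    print(f"{co2_kg / 13348:.4f} kg CO2 per km")  # ≈ 0.0781 implied per-passenger factor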

Map of flight path from Salt Lake City to Albury

See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Albury Airport (ABX).

Airport information

Origin: Salt Lake City International Airport
City: Salt Lake City, UT
Country: United States
IATA Code: SLC
ICAO Code: KSLC
Coordinates: 40°47′18″N, 111°58′40″W

Destination: Albury Airport
City: Albury
Country: Australia
IATA Code: ABX
ICAO Code: YMAY
Coordinates: 36°4′4″S, 146°57′28″E
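
To plug these coordinates into the formulas above, they first need to be converted from degrees-minutes-seconds to signed decimal degrees; a small sketch:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """South and west hemispheres map to negative decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    slc = (dms_to_decimal(40, 47, 18, "N"), dms_to_decimal(111, 58, 40, "W"))
    abx = (dms_to_decimal(36, 4, 4, "S"), dms_to_decimal(146, 57, 28, "E"))
    print(slc)  # (40.7883..., -111.9777...)
    print(abx)  # (-36.0677..., 146.9577...)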