
How far is Olbia from Salt Lake City, UT?

The distance between Salt Lake City (Salt Lake City International Airport) and Olbia (Olbia Costa Smeralda Airport) is 5721 miles / 9208 kilometers / 4972 nautical miles.

Salt Lake City International Airport – Olbia Costa Smeralda Airport

  • 5721 miles
  • 9208 kilometers
  • 4972 nautical miles


Distance from Salt Lake City to Olbia

There are several ways to calculate the distance from Salt Lake City to Olbia. Here are two standard methods:

Vincenty's formula (applied above)
  • 5721.498 miles
  • 9207.859 kilometers
  • 4971.846 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
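As an illustration, Vincenty's inverse formula can be sketched in Python on the WGS-84 ellipsoid, using the airport coordinates listed below converted to decimal degrees. This is a sketch of the standard iterative method, not necessarily the exact implementation used for the figures above:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid (non-antipodal points)."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sinlam, coslam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinlam,
                               cosU1 * sinU2 - sinU1 * cosU2 * coslam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm
                                     + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dsigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - dsigma) / 1000.0  # metres -> kilometres

# SLC (40.7883, -111.9778) to OLB (40.8986, 9.5175)
print(f"{vincenty_km(40.7883, -111.9778, 40.8986, 9.5175):.1f} km")  # ≈ 9207.9 km
```

The iteration converges quickly for points that are not near-antipodal, which is the case here.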

Haversine formula
  • 5707.391 miles
  • 9185.155 kilometers
  • 4959.587 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
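The haversine calculation is compact enough to show in full. A minimal Python sketch, assuming a mean Earth radius of 6371 km and the airport coordinates listed below:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# SLC: 40°47′18″N, 111°58′40″W  →  (40.7883, -111.9778)
# OLB: 40°53′55″N, 9°31′3″E     →  (40.8986, 9.5175)
print(f"{haversine_km(40.7883, -111.9778, 40.8986, 9.5175):.0f} km")  # ≈ 9185 km
```

The spherical result comes out about 23 km shorter than the ellipsoidal one, which matches the gap between the two figures above.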

How long does it take to fly from Salt Lake City to Olbia?

The estimated flight time from Salt Lake City International Airport to Olbia Costa Smeralda Airport is 11 hours and 19 minutes.
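The page does not state how the flight time is derived, but the figure is consistent with a common rule of thumb: a fixed allowance for takeoff and landing plus time at a typical jet cruise speed. A sketch under that assumption (the 850 km/h cruise speed and 30-minute overhead are assumptions, not the site's published method):

```python
def flight_time(distance_km, cruise_kmh=850, overhead_min=30):
    """Rough flight-time estimate: fixed takeoff/landing overhead plus
    time at cruise speed. Both parameters are assumed, not published."""
    total_min = overhead_min + distance_km / cruise_kmh * 60
    return divmod(int(total_min), 60)  # (hours, minutes)

h, m = flight_time(9207.859)
print(f"{h} hours and {m} minutes")  # → 11 hours and 19 minutes
```

With these parameters the Vincenty distance of 9207.859 km reproduces the quoted 11 hours 19 minutes.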

Flight carbon footprint between Salt Lake City International Airport (SLC) and Olbia Costa Smeralda Airport (OLB)

On average, flying from Salt Lake City to Olbia generates about 679 kg of CO2 per passenger; 679 kilograms equals 1,497 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
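The unit conversion behind the pounds figure is simple arithmetic (the 679 kg per-passenger value is the site's estimate, taken as given here):

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 679  # site's per-passenger CO2 estimate for this route
co2_lb = round(co2_kg * KG_TO_LB)
print(co2_lb)  # → 1497
```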

Map of flight path from Salt Lake City to Olbia

See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Olbia Costa Smeralda Airport (OLB).

Airport information

Origin Salt Lake City International Airport
City: Salt Lake City, UT
Country: United States
IATA Code: SLC
ICAO Code: KSLC
Coordinates: 40°47′18″N, 111°58′40″W
Destination Olbia Costa Smeralda Airport
City: Olbia
Country: Italy
IATA Code: OLB
ICAO Code: LIEO
Coordinates: 40°53′55″N, 9°31′3″E
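The coordinates above are given in degrees, minutes, and seconds, while distance formulas need signed decimal degrees. A small helper (hypothetical, for illustration) shows the conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees-minutes-seconds to signed decimal degrees.
    Southern and western hemispheres are negative by convention."""
    dd = deg + minutes / 60 + seconds / 3600
    return -dd if hemisphere in ("S", "W") else dd

print(round(dms_to_decimal(40, 47, 18, "N"), 4))   # SLC latitude  → 40.7883
print(round(dms_to_decimal(111, 58, 40, "W"), 4))  # SLC longitude → -111.9778
```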