
How far is Abuja from Salt Lake City, UT?

The distance between Salt Lake City (Salt Lake City International Airport) and Abuja (Nnamdi Azikiwe International Airport) is 7280 miles / 11716 kilometers / 6326 nautical miles.

Salt Lake City International Airport – Nnamdi Azikiwe International Airport
  • 7280 miles
  • 11716 kilometers
  • 6326 nautical miles


Distance from Salt Lake City to Abuja

There are several ways to calculate the distance from Salt Lake City to Abuja. Here are two standard methods:

Vincenty's formula (applied above)
  • 7280.217 miles
  • 11716.373 kilometers
  • 6326.335 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
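The calculator doesn't publish its implementation, but a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid looks like the following. The coordinates come from the airport information below; the iteration cap and convergence tolerance are assumptions, not the site's actual settings.

```python
import math

# WGS-84 ellipsoid parameters
A = 6378137.0            # semi-major axis, meters
F = 1 / 298.257223563    # flattening
B = (1 - F) * A          # semi-minor axis, meters

def vincenty_distance(lat1, lon1, lat2, lon2, iterations=200, tol=1e-12):
    """Distance in meters between two lat/lon points, Vincenty's inverse formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(iterations):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    A_ = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B_ = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B_ * sin_sigma * (
        cos_2sigma_m + B_ / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B_ / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return B * A_ * (sigma - delta_sigma)

# SLC (40°47′18″N, 111°58′40″W) to ABV (9°0′24″N, 7°15′47″E), decimal degrees
meters = vincenty_distance(40.78833, -111.97778, 9.00667, 7.26306)
print(f"{meters / 1609.344:.3f} mi")  # should land close to the 7280.217 mi above
```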

Haversine formula
  • 7272.022 miles
  • 11703.184 kilometers
  • 6319.214 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
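A self-contained haversine sketch is much shorter. The mean Earth radius of 6371 km is a common choice that reproduces the figures above closely, but the exact radius the calculator uses is an assumption.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius; the calculator's exact value is assumed

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# SLC (40°47′18″N, 111°58′40″W) to ABV (9°0′24″N, 7°15′47″E)
km = haversine_km(40.78833, -111.97778, 9.00667, 7.26306)
print(f"{km:.0f} km, {km / 1.609344:.0f} mi, {km / 1.852:.0f} nmi")
# should be close to the 11703 km / 7272 mi / 6319 nmi quoted above
```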

How long does it take to fly from Salt Lake City to Abuja?

The estimated flight time from Salt Lake City International Airport to Nnamdi Azikiwe International Airport is 14 hours and 17 minutes.
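The page doesn't state how this estimate is derived. A common back-of-envelope model is cruise time at an assumed average ground speed plus a fixed allowance for taxi, climb, and descent; both parameters below are assumptions, and with these defaults the result comes out slightly above the 14 hours 17 minutes quoted above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus a fixed allowance.
    The 500 mph and 30 min defaults are assumptions, not the site's model."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimated_flight_time(7280.217))  # "15 hours and 4 minutes" with these defaults
```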

Flight carbon footprint between Salt Lake City International Airport (SLC) and Nnamdi Azikiwe International Airport (ABV)

On average, flying from Salt Lake City to Abuja generates about 895 kg of CO2 per passenger, which is equivalent to roughly 1,972 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
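The underlying emissions model isn't published. Dividing the figures above implies a flat factor of roughly 0.123 kg of CO2 per passenger-mile, which a sketch might apply like this; the factor is reverse-engineered from this page, not the calculator's stated methodology.

```python
KG_PER_PASSENGER_MILE = 0.123  # implied by 895 kg / 7280 mi above; an assumption
KG_TO_LBS = 2.20462

def co2_estimate_kg(distance_miles):
    """Per-passenger CO2 estimate from jet-fuel burn, flat per-mile factor."""
    return distance_miles * KG_PER_PASSENGER_MILE

kg = co2_estimate_kg(7280.217)
print(f"{kg:.0f} kg CO2 = {kg * KG_TO_LBS:.0f} lbs")  # about 895 kg with this factor
```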

Map of flight path from Salt Lake City to Abuja

See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Nnamdi Azikiwe International Airport (ABV).

Airport information

Origin: Salt Lake City International Airport
City: Salt Lake City, UT
Country: United States
IATA Code: SLC
ICAO Code: KSLC
Coordinates: 40°47′18″N, 111°58′40″W
Destination: Nnamdi Azikiwe International Airport
City: Abuja
Country: Nigeria
IATA Code: ABV
ICAO Code: DNAA
Coordinates: 9°0′24″N, 7°15′47″E