
How far is Shihezi from Salt Lake City, UT?

The distance between Salt Lake City (Salt Lake City International Airport) and Shihezi (Shihezi Huayuan Airport) is 6474 miles / 10419 kilometers / 5626 nautical miles.

Salt Lake City International Airport – Shihezi Huayuan Airport

  • 6474 miles
  • 10419 kilometers
  • 5626 nautical miles


Distance from Salt Lake City to Shihezi

There are several ways to calculate the distance from Salt Lake City to Shihezi. Here are two standard methods:

Vincenty's formula (applied above)
  • 6473.902 miles
  • 10418.735 kilometers
  • 5625.667 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
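For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values listed under Airport information below; the iteration limit and convergence tolerance are assumptions, and a production implementation would also handle the rare nearly antipodal cases where the iteration fails to converge.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2):
    """Distance in km via Vincenty's inverse formula on the WGS-84 ellipsoid."""
    a, f = 6378137.0, 1 / 298.257223563            # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    U1 = atan((1 - f) * tan(radians(lat1)))        # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    L = radians((lon2 - lon1 + 540) % 360 - 180)   # longitude difference, normalized

    lam = L
    for _ in range(200):                           # iterate lambda until convergence
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2
                         + (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma
              * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (cos_sigma * (-1 + 2 * cos_2sm ** 2)
              - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0      # metres -> kilometres

# SLC (40°47′18″N, 111°58′40″W) to SHF (44°14′31″N, 85°53′25″E)
km = vincenty_km(40.788333, -111.977778, 44.241944, 85.890278)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi")     # expect close to 10418.7 km / 6473.9 mi
```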

Haversine formula
  • 6457.907 miles
  • 10392.994 kilometers
  • 5611.768 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
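As a point of comparison, here is a minimal haversine sketch in Python, assuming a mean Earth radius of 6371 km (the radius the calculator actually uses is not stated):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    h = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(h))

# SLC (40°47′18″N, 111°58′40″W) to SHF (44°14′31″N, 85°53′25″E)
km = haversine_km(40.788333, -111.977778, 44.241944, 85.890278)
print(f"{km:.0f} km / {km / 1.609344:.0f} mi / {km / 1.852:.0f} nmi")
# Expect figures within a kilometre or so of the ~10393 km quoted above.
```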

How long does it take to fly from Salt Lake City to Shihezi?

The estimated flight time from Salt Lake City International Airport to Shihezi Huayuan Airport is 12 hours and 45 minutes.
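The assumptions behind this estimate are not published on the page. One common back-of-the-envelope approach, sketched below, adds a fixed allowance for taxi, climb and descent to the cruise time over the distance; the 30-minute allowance and ~530 mph average speed are illustrative assumptions chosen to land near the quoted figure, not the calculator's documented parameters.

```python
def estimated_flight_time(distance_miles, cruise_mph=530, overhead_hours=0.5):
    """Cruise time over the distance plus a fixed taxi/climb/descent allowance."""
    total_minutes = round((overhead_hours + distance_miles / cruise_mph) * 60)
    return f"{total_minutes // 60} h {total_minutes % 60:02d} min"

print(estimated_flight_time(6474))   # "12 h 43 min" with these assumed parameters
```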

Flight carbon footprint between Salt Lake City International Airport (SLC) and Shihezi Huayuan Airport (SHF)

On average, flying from Salt Lake City to Shihezi generates about 781 kg of CO2 per passenger, which is roughly 1,723 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
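One common way such figures are produced is to multiply the flight distance by a per-passenger emission factor. The sketch below back-calculates that factor from the numbers on this page (about 75 g of CO2 per passenger-kilometre); it is an illustration, not the calculator's published methodology.

```python
# Distance-based CO2 estimate plus the kilogram-to-pound conversion.
# The 0.075 kg-per-passenger-km factor is back-calculated from this page's
# figures (about 781 kg over ~10419 km) and is purely illustrative.
EMISSION_FACTOR_KG_PER_PAX_KM = 0.075
distance_km = 10419
co2_kg = distance_km * EMISSION_FACTOR_KG_PER_PAX_KM
co2_lb = co2_kg * 2.20462                          # kilograms to pounds
print(f"{co2_kg:.0f} kg CO2 ≈ {co2_lb:.0f} lb")    # prints roughly 781 kg / 1723 lb
```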

Map of flight path from Salt Lake City to Shihezi

See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Shihezi Huayuan Airport (SHF).

Airport information

Origin: Salt Lake City International Airport
City: Salt Lake City, UT
Country: United States
IATA Code: SLC
ICAO Code: KSLC
Coordinates: 40°47′18″N, 111°58′40″W
Destination: Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E
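
The distance formulas above expect decimal degrees; here is a small sketch for converting the DMS coordinates listed above (south latitudes and west longitudes negative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

slc = (dms_to_decimal(40, 47, 18, "N"), dms_to_decimal(111, 58, 40, "W"))
shf = (dms_to_decimal(44, 14, 31, "N"), dms_to_decimal(85, 53, 25, "E"))
print(slc)   # approximately (40.788333, -111.977778)
print(shf)   # approximately (44.241944, 85.890278)
```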