
How far is Huaihua from Salt Lake City, UT?

The distance between Salt Lake City (Salt Lake City International Airport) and Huaihua (Huaihua Zhijiang Airport) is 7032 miles / 11317 kilometers / 6111 nautical miles.

Salt Lake City International Airport – Huaihua Zhijiang Airport
  • 7032 miles
  • 11317 kilometers
  • 6111 nautical miles


Distance from Salt Lake City to Huaihua

There are several ways to calculate the distance from Salt Lake City to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 7032.219 miles
  • 11317.260 kilometers
  • 6110.831 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
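As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The choice of ellipsoid is an assumption, since the page does not state which one its calculator uses; the decimal coordinates are converted from the airport coordinates listed below.

    import math

    def vincenty_km(lat1, lon1, lat2, lon2):
        """Ellipsoidal distance via Vincenty's inverse formula (WGS-84)."""
        a = 6378137.0                  # semi-major axis (m)
        f = 1 / 298.257223563          # flattening
        b = (1 - f) * a                # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):           # iterate until lambda converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0             # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
                  (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1000.0   # kilometres

    # SLC (40°47′18″N, 111°58′40″W) to HJJ (27°26′27″N, 109°42′0″E)
    print(round(vincenty_km(40.788333, -111.977778, 27.440833, 109.7), 3))

Run against the SLC and HJJ coordinates, this should land very close to the 11317.260 km figure quoted above.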

Haversine formula
  • 7018.839 miles
  • 11295.726 kilometers
  • 6099.204 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
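For comparison, the haversine calculation fits in a few lines. The mean Earth radius of 6371.0088 km below is an assumption; the page does not say which radius its calculator uses, so the final digits may differ slightly from the 11295.726 km figure above.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0088):
        """Great-circle distance on a sphere via the haversine formula."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    print(round(haversine_km(40.788333, -111.977778, 27.440833, 109.7), 3))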

How long does it take to fly from Salt Lake City to Huaihua?

The estimated flight time from Salt Lake City International Airport to Huaihua Zhijiang Airport is 13 hours and 48 minutes.
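The page does not publish the model behind this estimate. One simple assumption that comes close is dividing the Vincenty distance by an average block speed of roughly 510 mph; that speed is reverse-engineered here to match the quoted time, not taken from the site.

    miles = 7032.219
    avg_speed_mph = 510        # assumed average block speed (incl. climb/descent)
    hours = miles / avg_speed_mph
    h, m = int(hours), round(hours % 1 * 60)
    print(f"{h} h {m} min")    # -> 13 h 47 min, close to the quoted 13 h 48 min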

Flight carbon footprint between Salt Lake City International Airport (SLC) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Salt Lake City to Huaihua generates about 859 kg of CO2 per passenger; 859 kilograms equals 1,895 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
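The kilogram-to-pound conversion is straightforward; note that rounding 859 kg directly gives 1,894 lb, so the page's 1,895 lb presumably comes from an unrounded kilogram figure.

    co2_kg = 859
    print(round(co2_kg * 2.20462))   # -> 1894 lb (1 kg ≈ 2.20462 lb)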

Map of flight path from Salt Lake City to Huaihua

See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin: Salt Lake City International Airport
City: Salt Lake City, UT
Country: United States
IATA Code: SLC
ICAO Code: KSLC
Coordinates: 40°47′18″N, 111°58′40″W
Destination: Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E
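
Both formulas above take decimal degrees, while the coordinates here are given in degrees, minutes, and seconds. A small helper (a generic sketch, not the site's code) converts between the two:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # SLC: 40°47′18″N, 111°58′40″W  ->  (40.788333, -111.977778)
    slc = (dms_to_decimal(40, 47, 18, "N"), dms_to_decimal(111, 58, 40, "W"))
    # HJJ: 27°26′27″N, 109°42′0″E   ->  (27.440833, 109.7)
    hjj = (dms_to_decimal(27, 26, 27, "N"), dms_to_decimal(109, 42, 0, "E"))
    print(slc, hjj)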