How far is Huangyan from Salt Lake City, UT?

The distance between Salt Lake City (Salt Lake City International Airport) and Huangyan (Taizhou Luqiao Airport) is 6565 miles / 10565 kilometers / 5705 nautical miles.

Salt Lake City International Airport – Taizhou Luqiao Airport

6565 miles / 10565 kilometers / 5705 nautical miles

Distance from Salt Lake City to Huangyan

There are several ways to calculate the distance from Salt Lake City to Huangyan. Here are two standard methods:

Vincenty's formula (applied above)
  • 6564.737 miles
  • 10564.919 kilometers
  • 5704.600 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
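As a hedged illustration, the same kind of ellipsoidal distance can be computed with the third-party geopy library. Note that geopy's geodesic uses Karney's algorithm rather than Vincenty's original iteration, so the result agrees with Vincenty's to well under a meter but is not bit-identical. The coordinates are this page's airport coordinates converted to decimal degrees.

```python
# pip install geopy  (third-party library; an assumption of this sketch)
from geopy.distance import geodesic

# Airport coordinates from this page, converted to decimal degrees
slc = (40.7883, -111.9778)   # Salt Lake City International (40°47′18″N, 111°58′40″W)
hyn = (28.5619, 121.4289)    # Taizhou Luqiao (28°33′43″N, 121°25′44″E)

# geodesic() solves the inverse problem on the WGS-84 ellipsoid,
# the same ellipsoidal model Vincenty's formula works on
d = geodesic(slc, hyn)
print(f"{d.miles:.3f} miles / {d.kilometers:.3f} km / {d.nautical:.3f} nm")
```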

Haversine formula
  • 6551.652 miles
  • 10543.861 kilometers
  • 5693.230 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
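Because the haversine formula is short, it can be written out directly. A minimal Python sketch, assuming the common mean Earth radius of 6371 km (radius conventions vary slightly, so the output will not match the figures above to the last decimal):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius; conventions vary slightly

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# SLC -> HYN, using the decimal-degree coordinates from this page
km = haversine_km(40.7883, -111.9778, 28.5619, 121.4289)
print(f"{km:.3f} km / {km * 0.621371:.3f} miles / {km / 1.852:.3f} nm")
```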

How long does it take to fly from Salt Lake City to Huangyan?

The estimated flight time from Salt Lake City International Airport to Taizhou Luqiao Airport is 12 hours and 55 minutes.
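The page does not say how this estimate is derived. A common approach is cruise time at an assumed average speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses hypothetical parameters (both constants are assumptions, not this site's model, so its output will differ from the 12 hours 55 minutes above).

```python
def flight_time(distance_miles, avg_speed_mph=500, overhead_hours=0.5):
    """Rough gate-to-gate estimate: cruise time plus a fixed overhead.
    Both default parameters are hypothetical assumptions."""
    hours = distance_miles / avg_speed_mph + overhead_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours and {m} minutes"

print(flight_time(6565))  # a rough estimate; this page shows 12 h 55 min
```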

Flight carbon footprint between Salt Lake City International Airport (SLC) and Taizhou Luqiao Airport (HYN)

On average, flying from Salt Lake City to Huangyan generates about 794 kg of CO2 per passenger, which is equivalent to 1,751 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
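The pounds figure is a plain unit conversion (1 kg ≈ 2.20462 lbs), and a per-passenger estimate like this one typically multiplies distance by an emission factor. A sketch using a hypothetical factor back-solved from this page's own numbers (794 kg / 6565 miles ≈ 0.121 kg per mile):

```python
KG_PER_MILE = 0.121   # hypothetical factor back-solved from this page's figures
KG_TO_LBS = 2.20462   # kilograms-to-pounds conversion

def co2_per_passenger(distance_miles):
    """Per-passenger CO2 estimate in (kg, lbs) under the assumed factor."""
    kg = distance_miles * KG_PER_MILE
    return kg, kg * KG_TO_LBS

kg, lbs = co2_per_passenger(6565)
print(f"about {kg:.0f} kg of CO2 (~{lbs:.0f} lbs) per passenger")
```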

Map of flight path from Salt Lake City to Huangyan

See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Taizhou Luqiao Airport (HYN).

Airport information

Origin: Salt Lake City International Airport
City: Salt Lake City, UT
Country: United States
IATA Code: SLC
ICAO Code: KSLC
Coordinates: 40°47′18″N, 111°58′40″W
Destination: Taizhou Luqiao Airport
City: Huangyan
Country: China
IATA Code: HYN
ICAO Code: ZSLQ
Coordinates: 28°33′43″N, 121°25′44″E
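The coordinates above are given in degrees, minutes, and seconds; the decimal degrees used in the distance sketches earlier come from a straightforward conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees (south and west are negative)."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SLC: 40°47′18″N, 111°58′40″W  ->  (40.7883, -111.9778)
print(dms_to_decimal(40, 47, 18, "N"), dms_to_decimal(111, 58, 40, "W"))
# HYN: 28°33′43″N, 121°25′44″E  ->  (28.5619, 121.4289)
print(dms_to_decimal(28, 33, 43, "N"), dms_to_decimal(121, 25, 44, "E"))
```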