
How far is Beijing from Salt Lake City, UT?

The distance between Salt Lake City (Salt Lake City International Airport) and Beijing (Beijing Nanyuan Airport) is 6109 miles / 9831 kilometers / 5308 nautical miles.

Salt Lake City International Airport – Beijing Nanyuan Airport

6109 miles / 9831 kilometers / 5308 nautical miles


Distance from Salt Lake City to Beijing

There are several ways to calculate the distance from Salt Lake City to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 6108.522 miles
  • 9830.713 kilometers
  • 5308.161 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
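For the curious, the calculation can be reproduced in code. The sketch below is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid; the decimal-degree coordinates are converted from the SLC and NAY positions listed in the airport information section.

```python
# Minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid.
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in statute miles via Vincenty's inverse method."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha == 0 only for two points on the equator
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break              # iteration has converged

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters -> statute miles

# SLC (40°47′18″N, 111°58′40″W) to NAY (39°46′58″N, 116°23′16″E)
print(vincenty_miles(40.7883, -111.9778, 39.7828, 116.3878))  # ~6108.5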

Haversine formula
  • 6093.848 miles
  • 9807.098 kilometers
  • 5295.409 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
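The haversine version is much shorter. A minimal sketch, assuming the conventional mean Earth radius of 6371 km:

```python
# Minimal haversine sketch assuming a spherical Earth of radius 6371 km.
import math

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance in kilometers on a spherical Earth."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(a))

# Same SLC and NAY coordinates as above
print(haversine_km(40.7883, -111.9778, 39.7828, 116.3878))  # ~9807 km
```

Run on the same coordinates, this lands within a kilometer of the 9807.098 km figure above; the roughly 24 km gap versus Vincenty reflects the spherical versus ellipsoidal Earth model.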

How long does it take to fly from Salt Lake City to Beijing?

The estimated flight time from Salt Lake City International Airport to Beijing Nanyuan Airport is 12 hours and 3 minutes.
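The calculator does not publish the parameters behind this estimate. A common rule of thumb divides the distance by an assumed average block speed; the speed below is an illustrative assumption that happens to reproduce the 12 h 3 min figure, not a documented parameter:

```python
# Rough sketch: flight time as distance over an assumed average speed.
# The 507 mph block speed is an assumption for illustration only.
distance_miles = 6109
avg_speed_mph = 507
hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")  # ~12 h 3 min
```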

Flight carbon footprint between Salt Lake City International Airport (SLC) and Beijing Nanyuan Airport (NAY)

On average, flying from Salt Lake City to Beijing generates about 731 kg of CO2 per passenger; 731 kilograms equals 1,612 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
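These numbers are easy to sanity-check. A quick sketch of the unit conversion and the implied per-passenger intensity (the 0.45359237 kg-per-pound factor is exact by definition):

```python
# Sanity-check the stated figures: kg -> lb, and implied CO2 intensity.
co2_kg = 731
distance_km = 9831
print(co2_kg / 0.45359237)          # ~1611.6 lb, rounds to 1612
print(co2_kg / distance_km * 1000)  # ~74 g CO2 per passenger-km
```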

Map of flight path from Salt Lake City to Beijing

See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Beijing Nanyuan Airport (NAY).

Airport information

Origin: Salt Lake City International Airport
City: Salt Lake City, UT
Country: United States
IATA Code: SLC
ICAO Code: KSLC
Coordinates: 40°47′18″N, 111°58′40″W

Destination: Beijing Nanyuan Airport
City: Beijing
Country: China
IATA Code: NAY
ICAO Code: ZBNY
Coordinates: 39°46′58″N, 116°23′16″E