
How far is Shihezi from Ust-Kuyga?

The distance between Ust-Kuyga (Ust-Kuyga Airport) and Shihezi (Shihezi Huayuan Airport) is 2460 miles / 3959 kilometers / 2138 nautical miles.

The driving distance from Ust-Kuyga (UKG) to Shihezi (SHF) is 4332 miles / 6971 kilometers, and travel time by car is about 118 hours 48 minutes.

Ust-Kuyga Airport – Shihezi Huayuan Airport

  • 2460 miles
  • 3959 kilometers
  • 2138 nautical miles


Distance from Ust-Kuyga to Shihezi

There are several ways to calculate the distance from Ust-Kuyga to Shihezi. Here are two standard methods:

Vincenty's formula (applied above)
  • 2460.222 miles
  • 3959.343 kilometers
  • 2137.874 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
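As a rough illustration (not the calculator's own code), the standard Vincenty inverse method on the WGS-84 ellipsoid can be sketched in Python. The coordinates are the DMS values from the airport table further down this page, converted to decimal degrees:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = a * (1 - f)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):  # iterate until the longitude on the sphere converges
        sinL, cosL = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinL, cosU1 * sinU2 - sinU1 * cosU2 * cosL)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cosL
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinL / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000  # meters -> kilometers

# UKG (70°0′39″N, 135°38′42″E) to SHF (44°14′31″N, 85°53′25″E)
km = vincenty_km(70.010833, 135.645, 44.241944, 85.890278)
```

Dividing the result by 1.609344 recovers the roughly 2460-mile figure quoted above.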

Haversine formula
  • 2454.433 miles
  • 3950.027 kilometers
  • 2132.844 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
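A minimal haversine sketch, assuming the conventional mean Earth radius of 6371 km (the calculator's exact radius constant is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere, in kilometers."""
    R = 6371.0  # mean Earth radius in km (assumed; the site's constant may differ)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Ust-Kuyga (UKG) to Shihezi (SHF), coordinates from the airport table below
km = haversine_km(70.010833, 135.645, 44.241944, 85.890278)
```

The spherical result comes out a few kilometers short of the ellipsoidal Vincenty figure, matching the gap between the two sets of numbers above.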

How long does it take to fly from Ust-Kuyga to Shihezi?

The estimated flight time from Ust-Kuyga Airport to Shihezi Huayuan Airport is 5 hours and 9 minutes.
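The page does not publish its timing model, but such estimates are typically a fixed taxi/climb allowance plus cruise time at an average speed. A sketch under those assumptions (the 500 mph cruise speed and 30-minute overhead are illustrative constants, not the site's documented values, so the result only lands in the same ballpark as the 5 h 9 min quoted above):

```python
def flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    """Rough flight-time estimate: fixed taxi/climb overhead plus cruise time.

    cruise_mph and overhead_hours are illustrative assumptions, not the
    calculator's published model.
    """
    hours = overhead_hours + distance_miles / cruise_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

h, m = flight_time(2460.222)  # -> (5, 25)
```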

Flight carbon footprint between Ust-Kuyga Airport (UKG) and Shihezi Huayuan Airport (SHF)

On average, flying from Ust-Kuyga to Shihezi generates about 271 kg of CO2 per passenger, equivalent to roughly 596 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Ust-Kuyga to Shihezi

See the map of the shortest flight path between Ust-Kuyga Airport (UKG) and Shihezi Huayuan Airport (SHF).

Airport information

Origin Ust-Kuyga Airport
City: Ust-Kuyga
Country: Russia
IATA Code: UKG
ICAO Code: UEBT
Coordinates: 70°0′39″N, 135°38′42″E
Destination Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E
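The DMS coordinates above convert to the decimal degrees used in distance formulas as degrees + minutes/60 + seconds/3600, with the sign taken from the hemisphere. A small sketch:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Ust-Kuyga Airport: 70°0′39″N, 135°38′42″E
ukg = (dms_to_decimal(70, 0, 39, "N"), dms_to_decimal(135, 38, 42, "E"))
# Shihezi Huayuan Airport: 44°14′31″N, 85°53′25″E
shf = (dms_to_decimal(44, 14, 31, "N"), dms_to_decimal(85, 53, 25, "E"))
```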