
How far is Qian Gorlos Mongol Autonomous County from Ukhta?

The distance between Ukhta (Ukhta Airport) and Qian Gorlos Mongol Autonomous County (Songyuan Chaganhu Airport) is 2950 miles / 4747 kilometers / 2563 nautical miles.

Ukhta Airport (UCT) – Songyuan Chaganhu Airport (YSQ): 2950 miles / 4747 kilometers / 2563 nautical miles


Distance from Ukhta to Qian Gorlos Mongol Autonomous County

There are several ways to calculate the distance from Ukhta to Qian Gorlos Mongol Autonomous County. Here are two standard methods:

Vincenty's formula (applied above)
  • 2949.910 miles
  • 4747.420 kilometers
  • 2563.402 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
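The page does not show its own code, but an ellipsoidal calculation of this kind can be sketched in Python with the geopy library, whose geodesic() function solves the inverse problem on the WGS-84 ellipsoid (Karney's algorithm, a modern successor to Vincenty's iteration, so the digits may differ very slightly from a strict Vincenty implementation). Using geopy here is an assumption for illustration; the decimal coordinates are converted from the DMS values listed under Airport information.

    from geopy.distance import geodesic

    # Decimal degrees converted from the DMS coordinates listed under
    # "Airport information" below.
    ukhta = (63.566667, 53.804444)      # UCT: 63°34′0″N, 53°48′16″E
    songyuan = (44.938056, 124.550000)  # YSQ: 44°56′17″N, 124°33′0″E

    d = geodesic(ukhta, songyuan)       # inverse geodesic on the WGS-84 ellipsoid
    print(f"{d.miles:.3f} miles")
    print(f"{d.kilometers:.3f} kilometers")
    print(f"{d.nautical:.3f} nautical miles")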

Haversine formula
  • 2941.548 miles
  • 4733.963 kilometers
  • 2556.135 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
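The haversine formula is simple enough to compute directly. A minimal Python sketch follows; the mean Earth radius of 6371 km is a common choice but an assumption, since the site does not state which radius it uses, so the last decimals may differ from the figures above.

    import math

    EARTH_RADIUS_KM = 6371.0  # mean Earth radius; the site's exact value is not stated

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two lat/lon points, in kilometers."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

    km = haversine_km(63.566667, 53.804444, 44.938056, 124.55)
    print(f"{km:.3f} km / {km / 1.609344:.3f} mi / {km / 1.852:.3f} nm")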

How long does it take to fly from Ukhta to Qian Gorlos Mongol Autonomous County?

The estimated flight time from Ukhta Airport to Songyuan Chaganhu Airport is 6 hours and 5 minutes.
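The page does not say how this figure is derived. As a sanity check, the quoted distance and time imply an average speed of roughly 485 mph gate to gate; the generic estimator below uses assumed parameters for illustration only.

    # Average speed implied by the quoted figures (plain arithmetic):
    distance_miles = 2950
    block_time_h = 6 + 5 / 60                           # 6 hours 5 minutes
    print(f"{distance_miles / block_time_h:.0f} mph")   # ≈ 485 mph gate to gate

    # A generic estimator for other routes. The cruise speed and the fixed
    # allowance for taxi, climb and descent are assumptions, not the site's
    # published parameters.
    def estimate_minutes(miles, cruise_mph=500, overhead_min=30):
        return overhead_min + miles / cruise_mph * 60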

Flight carbon footprint between Ukhta Airport (UCT) and Songyuan Chaganhu Airport (YSQ)

On average, flying from Ukhta to Qian Gorlos Mongol Autonomous County generates about 328 kg (roughly 724 pounds) of CO2 per passenger. These figures are estimates and cover only the CO2 produced by burning jet fuel.
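As a rough cross-check, using only the figures quoted above (the underlying fuel-burn model is not published):

    co2_kg = 328
    distance_miles = 2950

    # Kilogram-to-pound check: 328 kg * 2.20462 ≈ 723 lb (the page quotes 724,
    # presumably converted from an unrounded kilogram figure).
    print(f"{co2_kg * 2.20462:.0f} lb")

    # Emission factor implied by the quoted numbers for this route:
    print(f"{co2_kg / distance_miles:.3f} kg CO2 per passenger-mile")  # ≈ 0.111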

Map of flight path from Ukhta to Qian Gorlos Mongol Autonomous County

See the map of the shortest flight path between Ukhta Airport (UCT) and Songyuan Chaganhu Airport (YSQ).
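The mapped path follows the great circle between the two airports. If you want to plot it yourself, intermediate waypoints can be generated with spherical linear interpolation; the sketch below is illustrative and not the site's own mapping code.

    import math

    def great_circle_points(lat1, lon1, lat2, lon2, n=50):
        """Waypoints along the great circle, via spherical linear interpolation."""
        def to_xyz(lat, lon):
            return (
                math.cos(math.radians(lat)) * math.cos(math.radians(lon)),
                math.cos(math.radians(lat)) * math.sin(math.radians(lon)),
                math.sin(math.radians(lat)),
            )
        a, b = to_xyz(lat1, lon1), to_xyz(lat2, lon2)
        omega = math.acos(sum(p * q for p, q in zip(a, b)))  # angular separation
        points = []
        for i in range(n + 1):
            t = i / n
            f1 = math.sin((1 - t) * omega) / math.sin(omega)
            f2 = math.sin(t * omega) / math.sin(omega)
            x, y, z = (f1 * p + f2 * q for p, q in zip(a, b))
            points.append((math.degrees(math.asin(z)), math.degrees(math.atan2(y, x))))
        return points

    path = great_circle_points(63.566667, 53.804444, 44.938056, 124.55)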

Airport information

Origin: Ukhta Airport
City: Ukhta
Country: Russia
IATA Code: UCT
ICAO Code: UUYH
Coordinates: 63°34′0″N, 53°48′16″E
Destination: Songyuan Chaganhu Airport
City: Qian Gorlos Mongol Autonomous County
Country: China
IATA Code: YSQ
ICAO Code: ZYSQ
Coordinates: 44°56′17″N, 124°33′0″E
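The coordinates above are given in degrees, minutes and seconds. The decimal values used in the earlier snippets come from the standard conversion; the helper name below is illustrative, not part of any library.

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to decimal degrees."""
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    print(dms_to_decimal(63, 34, 0, "N"))   # ≈ 63.5667  (UCT latitude)
    print(dms_to_decimal(124, 33, 0, "E"))  # 124.55     (YSQ longitude)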