
How far is Qianjiang from Uliastai?

The distance between Uliastai (Donoi Airport) and Qianjiang (Qianjiang Wulingshan Airport) is 1417 miles / 2280 kilometers / 1231 nautical miles.

The driving distance from Uliastai (ULZ) to Qianjiang (JIQ) is 2062 miles / 3319 kilometers, and travel time by car is about 45 hours 55 minutes.

Donoi Airport – Qianjiang Wulingshan Airport

1417 miles / 2280 kilometers / 1231 nautical miles


Distance from Uliastai to Qianjiang

There are several ways to calculate the distance from Uliastai to Qianjiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1416.778 miles
  • 2280.084 kilometers
  • 1231.147 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
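As a rough cross-check, the same ellipsoidal calculation is available in common libraries. The sketch below uses geopy, whose geodesic distance is computed with Karney's algorithm on the WGS-84 ellipsoid (a modern refinement of Vincenty's method), so its output should agree with the figures above to within a few meters. The decimal coordinates are converted from the airport information at the end of this page.

    from geopy.distance import geodesic

    # Decimal-degree coordinates derived from the DMS values listed below.
    ulz = (47.7092, 96.5256)    # Donoi Airport (ULZ)
    jiq = (29.5131, 108.8308)   # Qianjiang Wulingshan Airport (JIQ)

    # geodesic() solves the inverse problem on the WGS-84 ellipsoid.
    d = geodesic(ulz, jiq)
    print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} nm")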

Haversine formula
  • 1417.899 miles
  • 2281.887 kilometers
  • 1232.121 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
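For illustration, here is a minimal, self-contained haversine implementation in Python (a sketch, assuming the commonly used mean Earth radius of 6371 km); with the airport coordinates from this page it reproduces the haversine figure above to within about a kilometer.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two points on a spherical Earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    # ULZ (47°42′33″N, 96°31′32″E) to JIQ (29°30′47″N, 108°49′51″E)
    km = haversine_km(47.7092, 96.5256, 29.5131, 108.8308)
    print(f"{km:.1f} km / {km * 0.621371:.1f} miles")  # ≈ 2281.9 km / 1417.9 miles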

How long does it take to fly from Uliastai to Qianjiang?

The estimated flight time from Donoi Airport to Qianjiang Wulingshan Airport is 3 hours and 10 minutes.
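Estimates like this are typically derived from the great-circle distance and an assumed average block speed. As an illustrative check (the speed is an assumption, not from the source): at an average of roughly 450 mph over the 1417-mile route, 1417 ÷ 450 ≈ 3.15 hours, or about 3 hours 9 minutes, consistent with the estimate above. Actual times vary with aircraft type, routing, and winds.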

Flight carbon footprint between Donoi Airport (ULZ) and Qianjiang Wulingshan Airport (JIQ)

On average, flying from Uliastai to Qianjiang generates about 174 kg of CO2 per passenger, which is roughly 384 pounds (174 kg × 2.20462 lb/kg ≈ 383.6 lb). This figure is an estimate and includes only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Uliastai to Qianjiang

See the map of the shortest flight path between Donoi Airport (ULZ) and Qianjiang Wulingshan Airport (JIQ).

Airport information

Origin: Donoi Airport
City: Uliastai
Country: Mongolia
IATA Code: ULZ
ICAO Code: ZMDN
Coordinates: 47°42′33″N, 96°31′32″E
Destination: Qianjiang Wulingshan Airport
City: Qianjiang
Country: China
IATA Code: JIQ
ICAO Code: ZUQJ
Coordinates: 29°30′47″N, 108°49′51″E
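
The coordinates above are written in degrees, minutes, and seconds. A small helper like the following (a hypothetical convenience function, not part of this page) converts them to the decimal degrees used in the distance sketches earlier:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
        value = degrees + minutes / 60 + seconds / 3600
        # Southern and western hemispheres are negative by convention.
        return -value if hemisphere in ("S", "W") else value

    print(f"{dms_to_decimal(47, 42, 33, 'N'):.4f}")   # 47.7092 (ULZ latitude)
    print(f"{dms_to_decimal(108, 49, 51, 'E'):.4f}")  # 108.8308 (JIQ longitude)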