
How far is Yining from Ashgabat?

The distance between Ashgabat (Ashgabat International Airport) and Yining (Yining Airport) is 1265 miles / 2036 kilometers / 1099 nautical miles.

The driving distance from Ashgabat (ASB) to Yining (YIN) is 1616 miles / 2600 kilometers, and travel time by car is about 34 hours 12 minutes.

Ashgabat International Airport – Yining Airport

1265 miles / 2036 kilometers / 1099 nautical miles


Distance from Ashgabat to Yining

There are several ways to calculate the distance from Ashgabat to Yining. Here are two standard methods:

Vincenty's formula (applied above)
  • 1265.003 miles
  • 2035.825 kilometers
  • 1099.257 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
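As a rough illustration, the inverse Vincenty formula can be sketched in Python as below. This is a minimal version (the constants are the standard WGS-84 semi-major axis and flattening); production code should also handle non-convergence for nearly antipodal points, which does not arise here.

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty formula on the WGS-84 ellipsoid (minimal sketch)."""
    a = 6378137.0              # semi-major axis in meters
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis in meters

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    # Iterate on the longitude difference on the auxiliary sphere
    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    # Evaluate the ellipsoidal correction and the geodesic length
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# ASB (37°59′12″N, 58°21′39″E) to YIN (43°57′20″N, 81°19′49″E)
d = vincenty_distance_km(37 + 59/60 + 12/3600, 58 + 21/60 + 39/3600,
                         43 + 57/60 + 20/3600, 81 + 19/60 + 49/3600)
```

With the airport coordinates listed below, this returns approximately 2035.8 km, matching the figure above.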

Haversine formula
  • 1262.287 miles
  • 2031.453 kilometers
  • 1096.897 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
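The haversine calculation is compact enough to show in full. The sketch below assumes a mean Earth radius of 6371 km, a common convention:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)   # difference in latitude
    dlam = math.radians(lon2 - lon1)   # difference in longitude
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# ASB to YIN, using the airport coordinates listed below
d = haversine_km(37.986667, 58.360833, 43.955556, 81.330278)
```

This yields roughly 2031.5 km, in line with the haversine figure above; the small gap from the Vincenty result reflects the spherical versus ellipsoidal Earth model.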

How long does it take to fly from Ashgabat to Yining?

The estimated flight time from Ashgabat International Airport to Yining Airport is 2 hours and 53 minutes.
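A flight-time estimate of this kind typically combines a cruise leg with a fixed allowance for taxi, climb, and descent. The exact parameters used above are not published; the sketch below assumes a 500 mph cruise speed and a 30-minute overhead, both hypothetical:

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    """Rough block time: cruise leg plus a fixed taxi/climb/descent allowance.
    cruise_mph and overhead_hours are illustrative assumptions."""
    total = distance_miles / cruise_mph + overhead_hours
    hours = int(total)
    minutes = round((total - hours) * 60)
    return hours, minutes

h, m = estimate_flight_time(1265)  # → (3, 2), i.e. about 3 h 2 min
```

This rule of thumb gives about 3 h 2 min, close to the 2 h 53 min figure above, which presumably assumes a slightly faster cruise speed.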

Flight carbon footprint between Ashgabat International Airport (ASB) and Yining Airport (YIN)

On average, flying from Ashgabat to Yining generates about 165 kg of CO2 per passenger, which is about 363 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
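A per-passenger estimate like this is usually a distance multiplied by an emission factor. The factor below is simply back-calculated from the figures above (165 kg over 2036 km, about 0.081 kg CO2 per passenger-km) and is not an official value:

```python
LB_PER_KG = 2.20462  # pounds per kilogram

def co2_estimate_kg(distance_km, kg_per_pax_km=0.081):
    """Per-passenger CO2 from an assumed emission factor; the default
    factor is back-calculated from this route's figures, not an official value."""
    return distance_km * kg_per_pax_km

kg = co2_estimate_kg(2036)   # ≈ 165 kg
lb = kg * LB_PER_KG          # ≈ 364 lb
```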

Map of flight path and driving directions from Ashgabat to Yining

See the map of the shortest flight path between Ashgabat International Airport (ASB) and Yining Airport (YIN).

Airport information

Origin: Ashgabat International Airport
City: Ashgabat
Country: Turkmenistan
IATA Code: ASB
ICAO Code: UTAA
Coordinates: 37°59′12″N, 58°21′39″E
Destination: Yining Airport
City: Yining
Country: China
IATA Code: YIN
ICAO Code: ZWYN
Coordinates: 43°57′20″N, 81°19′49″E