
How far is Qingyang from Tashigang?

The distance between Tashigang (Yongphulla Airport) and Qingyang (Qingyang Xifeng Airport) is 1115 miles / 1794 kilometers / 969 nautical miles.
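The three figures are one geodesic distance expressed in different units. Assuming the standard conversion factors (1 mile = 1.609344 km exactly, 1 nautical mile = 1.852 km exactly), the conversion is a one-liner:

```python
KM_PER_MILE = 1.609344  # international mile, exact by definition
KM_PER_NMI = 1.852      # nautical mile, exact by definition

distance_km = 1794.054  # kilometre figure quoted above

miles = distance_km / KM_PER_MILE
nautical_miles = distance_km / KM_PER_NMI

print(f"{miles:.3f} mi")            # ~1114.773 mi
print(f"{nautical_miles:.3f} nmi")  # ~968.712 nmi
```

The rounded page values (1115 mi / 969 nmi) follow from these exact factors.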

The driving distance from Tashigang (YON) to Qingyang (IQN) is 2146 miles / 3454 kilometers, and travel time by car is about 43 hours 31 minutes.

Yongphulla Airport – Qingyang Xifeng Airport


Distance from Tashigang to Qingyang

There are several ways to calculate the distance from Tashigang to Qingyang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1114.774 miles
  • 1794.054 kilometers
  • 968.712 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
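The standard inverse Vincenty method iterates to convergence on the WGS-84 ellipsoid. A self-contained sketch (the page's exact implementation isn't shown; this follows the published iterative formulas):

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Airport coordinates from the information section below, in decimal degrees
km = vincenty_distance(27 + 15/60 + 23/3600, 91 + 30/60 + 52/3600,
                       35 + 47/60 + 58/3600, 107 + 36/60 + 10/3600) / 1000
print(f"{km:.3f} km")  # ~1794 km, matching the figure above
```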

Haversine formula
  • 1114.027 miles
  • 1792.853 kilometers
  • 968.063 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
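The haversine formula is much simpler than Vincenty's. A minimal sketch, assuming the commonly used mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Airport coordinates from the information section below, in decimal degrees
km = haversine_km(27 + 15/60 + 23/3600, 91 + 30/60 + 52/3600,
                  35 + 47/60 + 58/3600, 107 + 36/60 + 10/3600)
print(f"{km:.1f} km")  # ~1792.8 km, matching the figure above
```

The ~1.2 km gap between the two results reflects the spherical approximation; the ellipsoidal Vincenty figure is the more accurate of the two.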

How long does it take to fly from Tashigang to Qingyang?

The estimated flight time from Yongphulla Airport to Qingyang Xifeng Airport is 2 hours and 36 minutes.
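The page's estimation model isn't stated. A common rough rule (an assumption here, not the site's method) adds a fixed taxi/climb/descent allowance of about 30 minutes to cruise time at a typical narrow-body speed of roughly 500 mph, which lands in the same ballpark but won't necessarily reproduce the figure above exactly:

```python
def estimate_flight_time_hours(distance_miles,
                               cruise_mph=500.0,      # assumed typical cruise speed
                               overhead_hours=0.5):   # assumed taxi/climb/descent allowance
    """Rough flight-time estimate: fixed overhead plus time at cruise speed."""
    return overhead_hours + distance_miles / cruise_mph

hours = estimate_flight_time_hours(1114.774)
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"about {h} h {m} min")
```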

Flight carbon footprint between Yongphulla Airport (YON) and Qingyang Xifeng Airport (IQN)

On average, flying from Tashigang to Qingyang generates about 158 kg of CO2 per passenger; 158 kilograms is equal to 347 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Tashigang to Qingyang

See the map of the shortest flight path between Yongphulla Airport (YON) and Qingyang Xifeng Airport (IQN).

Airport information

Origin: Yongphulla Airport
City: Tashigang
Country: Bhutan
IATA Code: YON
ICAO Code: VQTY
Coordinates: 27°15′23″N, 91°30′52″E
Destination: Qingyang Xifeng Airport
City: Qingyang
Country: China
IATA Code: IQN
ICAO Code: ZLQY
Coordinates: 35°47′58″N, 107°36′10″E