How far is Batagay from Taiyuan?

The distance between Taiyuan (Taiyuan Wusu International Airport) and Batagay (Batagay Airport) is 2237 miles / 3600 kilometers / 1944 nautical miles.

The driving distance from Taiyuan (TYN) to Batagay (BQJ) is 3244 miles / 5220 kilometers, and travel time by car is about 86 hours 24 minutes.

Taiyuan Wusu International Airport – Batagay Airport

  • 2237 miles
  • 3600 kilometers
  • 1944 nautical miles
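The three figures above are the same distance expressed in different units. Using the exact definitions of the statute mile (1.609344 km) and the nautical mile (1.852 km), the conversion can be checked directly:

```python
# Exact unit definitions
MILES_PER_KM = 1 / 1.609344   # 1 statute mile = 1.609344 km
NM_PER_KM = 1 / 1.852         # 1 nautical mile = 1.852 km

km = 3600.223                 # Vincenty distance from the section below
print(round(km * MILES_PER_KM))  # 2237 miles
print(round(km * NM_PER_KM))     # 1944 nautical miles
```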

Distance from Taiyuan to Batagay

There are several ways to calculate the distance from Taiyuan to Batagay. Here are two standard methods:

Vincenty's formula (applied above)
  • 2237.075 miles
  • 3600.223 kilometers
  • 1943.965 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
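The standard iterative form of Vincenty's inverse formula on the WGS-84 ellipsoid can be sketched as follows. The airport coordinates are taken from the airport information section below; this is a minimal implementation that does not handle the rare near-antipodal cases where the iteration fails to converge:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                             * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# TYN (37°44′48″N, 112°37′40″E) to BQJ (67°38′52″N, 134°41′42″E)
d_km = vincenty_distance(37.74667, 112.62778, 67.64778, 134.695) / 1000
```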

Haversine formula
  • 2234.702 miles
  • 3596.404 kilometers
  • 1941.903 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
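A minimal haversine implementation, using a mean Earth radius of 6371 km and the same airport coordinates, reproduces the slightly shorter spherical figure:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# TYN to BQJ: about 3596 km, ~4 km less than the ellipsoidal result
d_km = haversine_distance(37.74667, 112.62778, 67.64778, 134.695)
```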

How long does it take to fly from Taiyuan to Batagay?

The estimated flight time from Taiyuan Wusu International Airport to Batagay Airport is 4 hours and 44 minutes.
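Estimates like this are typically computed as the distance divided by an assumed average cruise speed, plus a fixed allowance for taxi, climb, and descent. The parameters behind the 4 hours 44 minutes figure are not stated, so the cruise speed and overhead below are assumptions for illustration only:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough block time: cruise time plus a fixed taxi/climb/descent allowance.

    cruise_mph and overhead_hours are assumed values, not the site's parameters.
    """
    hours = distance_miles / cruise_mph + overhead_hours
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

print(estimate_flight_time(2237))  # (4, 58) with these assumed parameters
```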

Flight carbon footprint between Taiyuan Wusu International Airport (TYN) and Batagay Airport (BQJ)

On average, flying from Taiyuan to Batagay generates about 245 kg of CO2 per passenger; 245 kilograms equals about 540 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
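The pound figure is just a unit conversion of the kilogram estimate (small differences can arise from rounding the kilogram value first); the emission estimate itself depends on assumptions about aircraft type and load factor:

```python
KG_PER_LB = 0.45359237          # exact definition of the avoirdupois pound

co2_kg = 245
print(round(co2_kg / KG_PER_LB))  # 540 lbs
```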

Map of flight path and driving directions from Taiyuan to Batagay

See the map of the shortest flight path between Taiyuan Wusu International Airport (TYN) and Batagay Airport (BQJ).

Airport information

Origin Taiyuan Wusu International Airport
City: Taiyuan
Country: China
IATA Code: TYN
ICAO Code: ZBYN
Coordinates: 37°44′48″N, 112°37′40″E
Destination Batagay Airport
City: Batagay
Country: Russia
IATA Code: BQJ
ICAO Code: UEBB
Coordinates: 67°38′52″N, 134°41′42″E