How far is Bangda from Bayankhongor?

The distance between Bayankhongor (Bayankhongor Airport) and Bangda (Qamdo Bamda Airport) is 1094 miles / 1761 kilometers / 951 nautical miles.

The driving distance from Bayankhongor (BYN) to Bangda (BPX) is 1658 miles / 2668 kilometers, and travel time by car is about 39 hours 7 minutes.


Distance from Bayankhongor to Bangda

There are several ways to calculate the distance from Bayankhongor to Bangda. Here are two standard methods:

Vincenty's formula (applied above)
  • 1093.935 miles
  • 1760.517 kilometers
  • 950.603 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
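As a rough sketch of how such an ellipsoidal distance can be computed, here is a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed below converted to decimal degrees. This is an illustrative implementation, not necessarily the exact code or ellipsoid parameters used for the figures above:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns km."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):           # iterate until the longitude converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0             # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1000.0

# BYN (46°9′47″N, 100°42′14″E) to BPX (30°33′12″N, 97°6′29″E)
km = vincenty_km(46.163056, 100.703889, 30.553333, 97.108056)
```

For these two airports this yields approximately 1760.5 km, matching the Vincenty figure above.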

Haversine formula
  • 1095.669 miles
  • 1763.308 kilometers
  • 952.110 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
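The haversine calculation is much shorter. A minimal sketch, using the airport coordinates below converted to decimal degrees and a mean Earth radius of 6371 km (the radius actually used by the site is an assumption):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# BYN (46°9′47″N, 100°42′14″E) to BPX (30°33′12″N, 97°6′29″E)
km = haversine_km(46.163056, 100.703889, 30.553333, 97.108056)
```

This gives roughly 1763 km, in line with the haversine figure above.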

How long does it take to fly from Bayankhongor to Bangda?

The estimated flight time from Bayankhongor Airport to Qamdo Bamda Airport is 2 hours and 34 minutes.
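One common rule of thumb for such estimates is cruise time at a typical airliner speed plus a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute overhead below are assumptions for illustration, not the site's actual parameters (which evidently differ slightly, since they yield 2 h 34 min):

```python
def flight_time_estimate(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus fixed overhead.

    Returns (hours, minutes). Both parameters are illustrative assumptions.
    """
    total_min = overhead_min + distance_miles / cruise_mph * 60
    return divmod(round(total_min), 60)

h, m = flight_time_estimate(1094)  # (2, 41) with these assumed parameters
```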

Flight carbon footprint between Bayankhongor Airport (BYN) and Qamdo Bamda Airport (BPX)

On average, flying from Bayankhongor to Bangda generates about 156 kg of CO2 per passenger (roughly 345 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
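The arithmetic behind these figures can be checked directly. Converting 156 kg at 2.20462 lb/kg gives about 344 lb, so the published 345 lb presumably reflects an unrounded kilogram value; dividing by the 1761 km flight distance gives the implied per-kilometre emission factor:

```python
kg = 156.0
lb = kg * 2.20462        # kilograms to pounds: about 344 lb
per_km = kg / 1761       # implied factor: about 0.089 kg CO2 per km flown
```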

Map of flight path and driving directions from Bayankhongor to Bangda

See the map of the shortest flight path between Bayankhongor Airport (BYN) and Qamdo Bamda Airport (BPX).

Airport information

Origin Bayankhongor Airport
City: Bayankhongor
Country: Mongolia
IATA Code: BYN
ICAO Code: ZMBH
Coordinates: 46°9′47″N, 100°42′14″E
Destination Qamdo Bamda Airport
City: Bangda
Country: China
IATA Code: BPX
ICAO Code: ZUBD
Coordinates: 30°33′12″N, 97°6′29″E
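The coordinates above are in degrees-minutes-seconds; the distance formulas earlier expect decimal degrees. A small helper for the conversion (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

byn_lat = dms_to_decimal(46, 9, 47, "N")    # BYN latitude, about 46.1631
bpx_lat = dms_to_decimal(30, 33, 12, "N")   # BPX latitude, about 30.5533
```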