
How far is Bangda from San Antonio, TX?

The distance between San Antonio (San Antonio International Airport) and Bangda (Qamdo Bamda Airport) is 8173 miles / 13154 kilometers / 7102 nautical miles.

San Antonio International Airport – Qamdo Bamda Airport

Distance: 8173 miles / 13154 kilometers / 7102 nautical miles
Flight time: 15 h 58 min
CO2 emission: 1 024 kg


Distance from San Antonio to Bangda

There are several ways to calculate the distance from San Antonio to Bangda. Here are two standard methods:

Vincenty's formula (applied above)
  • 8173.377 miles
  • 13153.776 kilometers
  • 7102.471 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
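As a sketch, Vincenty's iterative inverse method can be written in plain Python. This is a standard textbook implementation on the WGS-84 ellipsoid (the page does not state which ellipsoid it uses, so WGS-84 is an assumption); the coordinates are the decimal form of the airport coordinates listed below.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Vincenty inverse: geodesic distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)
    # Normalize the longitude difference to [-180, 180] degrees.
    L = math.radians(((lon2 - lon1 + 540.0) % 360.0) - 180.0)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):          # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# SAT (29°32′1″N, 98°28′11″W) to BPX (30°33′12″N, 97°6′29″E)
m = vincenty_inverse(29.533611, -98.469722, 30.553333, 97.108056)
print(f"{m / 1000:.1f} km")  # roughly 13154 km, matching the figure above
```

The longitude normalization matters here: the raw difference exceeds 180°, and the geodesic runs the shorter way around.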

Haversine formula
  • 8160.579 miles
  • 13133.180 kilometers
  • 7091.350 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
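The haversine calculation fits in a few lines of Python. The mean earth radius of 6371.0088 km is an assumption (the page does not state which radius it uses); the coordinates are the decimal form of the airport coordinates listed below.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0088):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# SAT (29°32′1″N, 98°28′11″W) to BPX (30°33′12″N, 97°6′29″E)
km = haversine_km(29.533611, -98.469722, 30.553333, 97.108056)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi")  # roughly 13133 km / 8160 mi
```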

How long does it take to fly from San Antonio to Bangda?

The estimated flight time from San Antonio International Airport to Qamdo Bamda Airport is 15 hours and 58 minutes.
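The page does not state how its estimate is derived. A common rule of thumb is distance divided by an average cruise speed, plus a fixed allowance for taxi, climb, and descent; the 500 mph speed and 30-minute overhead below are assumed values, not the site's model, so the result differs somewhat from the figure above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb allowance.
    cruise_mph and overhead_min are assumed values, not the site's model."""
    total_min = distance_miles / cruise_mph * 60.0 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return hours, minutes

print(estimate_flight_time(8173.377))  # rough estimate; the page shows 15 h 58 min
```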

Flight carbon footprint between San Antonio International Airport (SAT) and Qamdo Bamda Airport (BPX)

On average, flying from San Antonio to Bangda generates about 1 024 kg of CO2 per passenger; 1 024 kilograms is equivalent to 2 257 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
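The kilogram-to-pound conversion above checks out with the standard factor of about 2.20462 lb per kg (the page appears to drop the fraction rather than round):

```python
KG_TO_LB = 2.20462262  # pounds per kilogram

co2_kg = 1024
co2_lb = co2_kg * KG_TO_LB   # ≈ 2257.5 lb
print(int(co2_lb))           # 2257, matching the figure above (fraction dropped)
```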

Map of flight path from San Antonio to Bangda

See the map of the shortest flight path between San Antonio International Airport (SAT) and Qamdo Bamda Airport (BPX).

Airport information

Origin: San Antonio International Airport
City: San Antonio, TX
Country: United States
IATA Code: SAT
ICAO Code: KSAT
Coordinates: 29°32′1″N, 98°28′11″W
Destination: Qamdo Bamda Airport
City: Bangda
Country: China
IATA Code: BPX
ICAO Code: ZUBD
Coordinates: 30°33′12″N, 97°6′29″E
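The degrees/minutes/seconds coordinates above convert to the decimal degrees used in the distance formulas with a small helper (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees (S and W are negative)."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SAT: 29°32′1″N, 98°28′11″W
sat = (dms_to_decimal(29, 32, 1, "N"), dms_to_decimal(98, 28, 11, "W"))
# BPX: 30°33′12″N, 97°6′29″E
bpx = (dms_to_decimal(30, 33, 12, "N"), dms_to_decimal(97, 6, 29, "E"))
print(sat, bpx)  # ≈ (29.5336, -98.4697) and (30.5533, 97.1081)
```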