How far is Yibin from San Antonio, TX?

The distance between San Antonio (San Antonio International Airport) and Yibin (Yibin Wuliangye Airport) is 8139 miles / 13099 kilometers / 7073 nautical miles.

San Antonio International Airport – Yibin Wuliangye Airport

Distance: 8139 miles / 13099 kilometers / 7073 nautical miles
Flight time: 15 h 54 min
CO2 emission: 1 019 kg

Distance from San Antonio to Yibin

There are several ways to calculate the distance from San Antonio to Yibin. Here are two standard methods:

Vincenty's formula (applied above)
  • 8139.208 miles
  • 13098.786 kilometers
  • 7072.779 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
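
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values listed under Airport information below; the calculator's own implementation and rounding may differ slightly.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate lambda to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
              (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344   # metres -> statute miles

# SAT (29°32′1″N, 98°28′11″W) to YBP (28°51′28″N, 104°31′30″E), decimal degrees
print(vincenty_miles(29.5336, -98.4697, 28.8578, 104.5250))  # ≈ 8139.2 miles
```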

Haversine formula
  • 8126.563 miles
  • 13078.435 kilometers
  • 7061.790 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface.
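
A haversine sketch is much shorter. The mean earth radius of 3958.8 miles (6371.0 km) used here is a common convention, though not necessarily the value the calculator assumes:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance on a sphere of the given radius (statute miles)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_mi * math.asin(math.sqrt(h))

print(haversine_miles(29.5336, -98.4697, 28.8578, 104.5250))  # ≈ 8127 miles
```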

How long does it take to fly from San Antonio to Yibin?

The estimated flight time from San Antonio International Airport to Yibin Wuliangye Airport is 15 hours and 54 minutes.
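
The calculator does not publish its timing model. As a rough sketch, dividing the great-circle distance by an assumed average block speed of about 512 mph happens to reproduce the quoted figure, but that speed is an illustrative assumption, not a documented parameter:

```python
# Illustrative only: the calculator's actual timing model is not published.
distance_mi = 8139
avg_speed_mph = 512            # assumed average block speed
hours = distance_mi / avg_speed_mph
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} h {m} min")        # 15 h 54 min
```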

Flight carbon footprint between San Antonio International Airport (SAT) and Yibin Wuliangye Airport (YBP)

On average, flying from San Antonio to Yibin generates about 1 019 kg of CO2 per passenger, which is equivalent to 2 246 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
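
The pound figure is a plain unit conversion (1 kg ≈ 2.20462 lb), for example:

```python
kg = 1019
lbs = kg * 2.20462262          # 1 kg = 2.20462262 lb
print(lbs)                     # ≈ 2246.5, shown on the page as 2 246 lbs
```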

Map of flight path from San Antonio to Yibin

See the map of the shortest flight path between San Antonio International Airport (SAT) and Yibin Wuliangye Airport (YBP).

Airport information

Origin: San Antonio International Airport
City: San Antonio, TX
Country: United States
IATA Code: SAT
ICAO Code: KSAT
Coordinates: 29°32′1″N, 98°28′11″W
Destination: Yibin Wuliangye Airport
City: Yibin
Country: China
IATA Code: YBP
ICAO Code: ZUYB
Coordinates: 28°51′28″N, 104°31′30″E
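
To use these coordinates with the formulas above, first convert them from degrees/minutes/seconds to decimal degrees. A small helper function (hypothetical, not part of the site) might look like:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(29, 32, 1, "N"), dms_to_decimal(98, 28, 11, "W"))    # 29.5336, -98.4697
print(dms_to_decimal(28, 51, 28, "N"), dms_to_decimal(104, 31, 30, "E"))  # 28.8578, 104.525
```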