
How far is Zhanjiang from San Antonio, TX?

The distance between San Antonio (San Antonio International Airport) and Zhanjiang (Zhanjiang Airport) is 8451 miles / 13601 kilometers / 7344 nautical miles.

San Antonio International Airport – Zhanjiang Airport

Distance: 8451 miles / 13601 kilometers / 7344 nautical miles
Flight time: 16 h 30 min
CO2 emission: 1 065 kg


Distance from San Antonio to Zhanjiang

There are several ways to calculate the distance from San Antonio to Zhanjiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 8451.484 miles
  • 13601.344 kilometers
  • 7344.138 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
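As an illustration (not the site's own code), the inverse Vincenty iteration on the WGS-84 ellipsoid can be sketched in Python. The decimal-degree coordinates below are converted from the airport coordinates listed further down, and the result should land close to the 8451-mile figure above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, returned in statute miles."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344  # metres -> statute miles

# SAT (29°32′1″N, 98°28′11″W) -> ZHA (21°12′51″N, 110°21′28″E) in decimal degrees
print(round(vincenty_miles(29.5336, -98.4697, 21.2142, 110.3578), 1))  # roughly 8451 miles
```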

Haversine formula
  • 8440.114 miles
  • 13583.047 kilometers
  • 7334.259 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
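A minimal haversine sketch in Python, again for illustration only; the exact result depends on which Earth radius is assumed (here a mean radius of about 3958.8 miles):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a sphere with the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_miles * math.asin(math.sqrt(a))

# Same coordinates as above: SAT -> ZHA
print(round(haversine_miles(29.5336, -98.4697, 21.2142, 110.3578), 1))  # roughly 8440 miles
```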

How long does it take to fly from San Antonio to Zhanjiang?

The estimated flight time from San Antonio International Airport to Zhanjiang Airport is 16 hours and 30 minutes.
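The site does not publish its timing assumptions. One reconstruction that happens to reproduce the 16 h 30 min figure is a fixed 30-minute allowance for taxi, climb and descent plus cruise at roughly 850 km/h; both values are assumptions, not figures from the site:

```python
def estimate_flight_time(distance_km, cruise_kmh=850, overhead_min=30):
    """Rough block-time estimate: fixed overhead + distance at an assumed cruise speed.

    cruise_kmh and overhead_min are assumed values, not published by the calculator.
    """
    total_min = overhead_min + distance_km / cruise_kmh * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(13601))  # -> "16 h 30 min" under these assumptions
```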

Flight carbon footprint between San Antonio International Airport (SAT) and Zhanjiang Airport (ZHA)

On average, flying from San Antonio to Zhanjiang generates about 1 065 kg of CO2 per passenger; 1 065 kilograms is equivalent to 2 348 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
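The pound figure is a straightforward unit conversion (about 2.20462 lb per kg):

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 1065
print(f"{co2_kg * KG_TO_LB:,.0f} lbs")  # -> "2,348 lbs"
```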

Map of flight path from San Antonio to Zhanjiang

See the map of the shortest flight path between San Antonio International Airport (SAT) and Zhanjiang Airport (ZHA).

Airport information

Origin: San Antonio International Airport
City: San Antonio, TX
Country: United States
IATA Code: SAT
ICAO Code: KSAT
Coordinates: 29°32′1″N, 98°28′11″W
Destination: Zhanjiang Airport
City: Zhanjiang
Country: China
IATA Code: ZHA
ICAO Code: ZGZJ
Coordinates: 21°12′51″N, 110°21′28″E