How far is San Borja from Beijing?

The distance between Beijing (Beijing Capital International Airport) and San Borja (Capitán Germán Quiroga Guardia Airport) is 10682 miles / 17191 kilometers / 9282 nautical miles.

Beijing Capital International Airport – Capitán Germán Quiroga Guardia Airport

Distance: 10682 miles / 17191 kilometers / 9282 nautical miles
Flight time: 20 h 43 min
CO2 emission: 1 410 kg

Distance from Beijing to San Borja

There are several ways to calculate the distance from Beijing to San Borja. Here are two standard methods:

Vincenty's formula (applied above)
  • 10681.739 miles
  • 17190.593 kilometers
  • 9282.177 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
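
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the airport coordinates listed below, converted to decimal degrees; the convergence tolerance and iteration cap are assumptions, so the last decimals may differ from the site's own implementation.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        """Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles."""
        a = 6378137.0              # WGS-84 semi-major axis (meters)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = a * (1 - f)            # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):       # iterate lambda to convergence
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2   # (equatorial edge case not handled)
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                         (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma) / 1609.344  # meters -> statute miles

    # PEK (40°4′48″N, 116°35′5″E) to SRJ (14°51′33″S, 66°44′15″W)
    print(vincenty_miles(40.08, 116.584722, -14.859167, -66.7375))  # ≈ 10681.7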

Haversine formula
  • 10682.704 miles
  • 17192.145 kilometers
  • 9283.016 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
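
A matching sketch of the haversine formula; the mean earth radius of 6371 km is an assumption, and the radius chosen is precisely why the spherical figure differs slightly from the ellipsoidal one above.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given radius (kilometers)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    print(haversine_km(40.08, 116.584722, -14.859167, -66.7375))  # ≈ 17192 km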

How long does it take to fly from Beijing to San Borja?

The estimated flight time from Beijing Capital International Airport to Capitán Germán Quiroga Guardia Airport is 20 hours and 43 minutes.
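
The site does not publish its flight-time model. As a purely illustrative sketch, a common rule of thumb adds a fixed overhead for taxi, climb, and descent to the distance flown at an assumed average speed; both parameters below are assumptions, which is why the result only approximates the 20 hours and 43 minutes quoted above.

    def estimated_flight_hours(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
        """Illustrative estimate: fixed overhead plus cruise time (assumed parameters)."""
        return overhead_hours + distance_miles / cruise_mph

    hours = estimated_flight_hours(10682)
    print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ~21 h 52 min with these assumptions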

Flight carbon footprint between Beijing Capital International Airport (PEK) and Capitán Germán Quiroga Guardia Airport (SRJ)

On average, flying from Beijing to San Borja generates about 1 410 kg of CO2 per passenger, which is roughly 3 108 pounds (lbs). These figures are estimates and cover only the CO2 produced by burning jet fuel.
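
As a quick check of that conversion (1 kg ≈ 2.20462 lbs):

    kg = 1410
    lbs = kg * 2.20462     # ≈ 3108.5; the page rounds this to 3 108 lbs
    print(f"{lbs:.1f} lbs")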

Map of flight path from Beijing to San Borja

See the map of the shortest flight path between Beijing Capital International Airport (PEK) and Capitán Germán Quiroga Guardia Airport (SRJ).
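
The map itself is not reproduced here. To draw the path yourself, here is a minimal sketch (assuming a spherical earth) that samples waypoints along the great circle by spherical linear interpolation:

    import math

    def great_circle_waypoints(lat1, lon1, lat2, lon2, n=10):
        """Return n+1 (lat, lon) points along the great-circle path."""
        def to_vec(lat, lon):
            phi, lam = math.radians(lat), math.radians(lon)
            return (math.cos(phi) * math.cos(lam),
                    math.cos(phi) * math.sin(lam),
                    math.sin(phi))
        ax, ay, az = to_vec(lat1, lon1)
        bx, by, bz = to_vec(lat2, lon2)
        sigma = math.acos(max(-1.0, min(1.0, ax * bx + ay * by + az * bz)))
        points = []
        for i in range(n + 1):
            t = i / n
            u = math.sin((1 - t) * sigma) / math.sin(sigma)
            v = math.sin(t * sigma) / math.sin(sigma)
            x, y, z = (u * ax + v * bx, u * ay + v * by, u * az + v * bz)
            points.append((math.degrees(math.asin(max(-1.0, min(1.0, z)))),
                           math.degrees(math.atan2(y, x))))
        return points

    for lat, lon in great_circle_waypoints(40.08, 116.584722, -14.859167, -66.7375, n=4):
        print(f"{lat:8.3f}, {lon:9.3f}")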

Airport information

Origin: Beijing Capital International Airport
City: Beijing
Country: China
IATA Code: PEK
ICAO Code: ZBAA
Coordinates: 40°4′48″N, 116°35′5″E

Destination: Capitán Germán Quiroga Guardia Airport
City: San Borja
Country: Bolivia
IATA Code: SRJ
ICAO Code: SLSB
Coordinates: 14°51′33″S, 66°44′15″W