
How far is San Borja from Santa Teresita?

The distance between Santa Teresita (Santa Teresita Airport) and San Borja (Capitán Germán Quiroga Guardia Airport) is 1616 miles / 2600 kilometers / 1404 nautical miles.

The driving distance from Santa Teresita (SST) to San Borja (SRJ) is 2147 miles / 3455 kilometers, and travel time by car is about 45 hours 2 minutes.


Distance from Santa Teresita to San Borja

There are several ways to calculate the distance from Santa Teresita to San Borja. Here are two standard methods:

Vincenty's formula (applied above)
  • 1615.638 miles
  • 2600.118 kilometers
  • 1403.951 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
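The ellipsoidal calculation above can be sketched with a standard implementation of Vincenty's inverse method on the WGS-84 ellipsoid; this is a textbook version, not the site's own code, and the convergence threshold and iteration cap are arbitrary choices:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: geodesic distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0                # semi-major axis (m)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis (m)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):  # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # metres

# Airport coordinates (decimal degrees) from the airport information section
metres = vincenty_distance(-36.542222, -56.721667, -14.859167, -66.737500)
print(round(metres / 1000, 3))  # kilometres, should be close to the 2600.118 km quoted above
```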

Haversine formula
  • 1620.247 miles
  • 2607.535 kilometers
  • 1407.956 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
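The spherical calculation is much simpler. A minimal sketch, assuming the commonly used mean Earth radius of 6371 km (the site's exact radius is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Airport coordinates (decimal degrees) from the airport information section
sst = (-36.542222, -56.721667)   # Santa Teresita (SST)
srj = (-14.859167, -66.737500)   # San Borja (SRJ)
print(round(haversine_km(*sst, *srj), 1))  # kilometres
```

With these inputs the result lands within a kilometre or so of the 2607.535 km quoted above; small differences come from rounding the coordinates and the choice of radius.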

How long does it take to fly from Santa Teresita to San Borja?

The estimated flight time from Santa Teresita Airport to Capitán Germán Quiroga Guardia Airport is 3 hours and 33 minutes.
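Estimates like this typically follow a rule of thumb: cruise time at an assumed average speed plus a fixed allowance for taxi, climb, and descent. The site's exact parameters are not stated; a sketch with a commonly assumed 500 mph cruise and a 30-minute overhead gives a figure in the same ballpark as the 3 hours 33 minutes quoted above:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: cruise leg plus a fixed taxi/climb/descent overhead.

    Returns (hours, minutes). The cruise speed and overhead are assumptions,
    not the calculator's actual parameters.
    """
    total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
    return divmod(total_min, 60)

hours, minutes = estimated_flight_time(1616)  # SST -> SRJ great-circle distance
print(f"{hours} h {minutes} min")
```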

Flight carbon footprint between Santa Teresita Airport (SST) and Capitán Germán Quiroga Guardia Airport (SRJ)

On average, flying from Santa Teresita to San Borja generates about 187 kg of CO2 per passenger; 187 kilograms equals 412 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
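The unit conversion above is straightforward; a short sketch, using the exact kilogram-to-pound definition (the per-passenger emission factor behind the 187 kg figure is the site's own and is not reproduced here):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lbs(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

co2_kg = 187  # per-passenger estimate quoted above
print(round(kg_to_lbs(co2_kg)))  # -> 412
```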

Map of flight path and driving directions from Santa Teresita to San Borja

See the map of the shortest flight path between Santa Teresita Airport (SST) and Capitán Germán Quiroga Guardia Airport (SRJ).

Airport information

Origin Santa Teresita Airport
City: Santa Teresita
Country: Argentina
IATA Code: SST
ICAO Code: SAZL
Coordinates: 36°32′32″S, 56°43′18″W
Destination Capitán Germán Quiroga Guardia Airport
City: San Borja
Country: Bolivia
IATA Code: SRJ
ICAO Code: SLSB
Coordinates: 14°51′33″S, 66°44′15″W
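The coordinates above are given in degrees, minutes, and seconds; the distance formulas need signed decimal degrees (negative for south and west). A small conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

sst_lat = dms_to_decimal(36, 32, 32, "S")   # 36°32′32″S -> about -36.5422
sst_lon = dms_to_decimal(56, 43, 18, "W")   # 56°43′18″W
srj_lat = dms_to_decimal(14, 51, 33, "S")   # 14°51′33″S
srj_lon = dms_to_decimal(66, 44, 15, "W")   # 66°44′15″W
print(sst_lat, sst_lon, srj_lat, srj_lon)
```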