
How far is San Borja from Ushuaia?

The distance between Ushuaia (Ushuaia – Malvinas Argentinas International Airport) and San Borja (Capitán Germán Quiroga Guardia Airport) is 2758 miles / 4438 kilometers / 2397 nautical miles.

The driving distance from Ushuaia (USH) to San Borja (SRJ) is 3483 miles / 5605 kilometers, and travel time by car is about 72 hours 37 minutes.


Distance from Ushuaia to San Borja

There are several ways to calculate the distance from Ushuaia to San Borja. Here are two standard methods:

Vincenty's formula (applied above)
  • 2757.912 miles
  • 4438.430 kilometers
  • 2396.560 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
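For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is an illustration of the standard published algorithm, not this site's exact implementation; the function name, convergence tolerance, and iteration cap are illustrative choices. The coordinates are the airports' positions (from the airport information below) in decimal degrees.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0             # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate lambda to convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0                        # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial line: cos^2(alpha) = 0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    meters = b * A * (sigma - deltaSigma)
    return meters / 1609.344                  # meters -> statute miles

# USH and SRJ coordinates (decimal degrees, from the airport information below)
print(round(vincenty_miles(-54.8431, -68.2956, -14.8592, -66.7375), 1))
# expected to land close to the 2757.9-mile Vincenty figure above
```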

Haversine formula
  • 2763.906 miles
  • 4448.076 kilometers
  • 2401.769 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
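The great-circle figure is easy to reproduce. A minimal Python sketch, using the conventional mean Earth radius of 3958.8 statute miles (about 6371 km); a different radius choice shifts the result slightly:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere, in statute miles."""
    R = 3958.8  # mean Earth radius (~6371 km) in statute miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

print(round(haversine_miles(-54.8431, -68.2956, -14.8592, -66.7375), 1))
# expected to land close to the 2763.9-mile haversine figure above
```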

How long does it take to fly from Ushuaia to San Borja?

The estimated flight time from Ushuaia – Malvinas Argentinas International Airport to Capitán Germán Quiroga Guardia Airport is 5 hours and 43 minutes.
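The page does not state the timing model behind this estimate, but figures like this are typically a fixed climb/descent allowance plus great-circle distance divided by an assumed cruise speed. A minimal sketch, assuming a 30-minute allowance and a 500 mph cruise (both illustrative constants, not the site's):

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed climb/descent allowance."""
    total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
    return divmod(total_min, 60)

hours, minutes = estimate_flight_time(2758)
print(f"{hours} h {minutes} min")  # about 6 h 1 min with these assumed constants;
                                   # the 5 h 43 min above implies slightly different ones
```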

Flight carbon footprint between Ushuaia – Malvinas Argentinas International Airport (USH) and Capitán Germán Quiroga Guardia Airport (SRJ)

On average, flying from Ushuaia to San Borja generates about 305 kg of CO2 per passenger, which is roughly 673 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
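The unit conversion behind the quoted figures is simple arithmetic; the 305 kg estimate itself depends on aircraft type, load factor, and routing. A quick check, taking the page's numbers as given:

```python
kg_co2 = 305                # page's per-passenger estimate
miles = 2758
print(kg_co2 * 2.20462)     # ~672.4 lbs; the page's 673 suggests it converts
                            # an unrounded kilogram value before rounding
print(kg_co2 / miles)       # ~0.11 kg CO2 per passenger-mile on this route
```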

Map of flight path and driving directions from Ushuaia to San Borja

See the map of the shortest flight path between Ushuaia – Malvinas Argentinas International Airport (USH) and Capitán Germán Quiroga Guardia Airport (SRJ).

Airport information

Origin: Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
Destination: Capitán Germán Quiroga Guardia Airport
City: San Borja
Country: Bolivia
IATA Code: SRJ
ICAO Code: SLSB
Coordinates: 14°51′33″S, 66°44′15″W
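The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier take decimal degrees. A small helper (the function name and argument order are my own) performs the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    dd = degrees + minutes / 60 + seconds / 3600
    return -dd if hemisphere in ("S", "W") else dd

print(dms_to_decimal(54, 50, 35, "S"))   # ~ -54.8431 (USH latitude)
print(dms_to_decimal(68, 17, 44, "W"))   # ~ -68.2956 (USH longitude)
print(dms_to_decimal(14, 51, 33, "S"))   # ~ -14.8592 (SRJ latitude)
print(dms_to_decimal(66, 44, 15, "W"))   # ~ -66.7375 (SRJ longitude)
```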