
How far is San Borja from Sapporo?

The distance between Sapporo (New Chitose Airport) and San Borja (Capitán Germán Quiroga Guardia Airport) is 9879 miles / 15898 kilometers / 8584 nautical miles.

New Chitose Airport – Capitán Germán Quiroga Guardia Airport

Distance: 9879 miles / 15898 kilometers / 8584 nautical miles
Flight time: 19 h 12 min
CO2 emissions: 1 283 kg


Distance from Sapporo to San Borja

There are several ways to calculate the distance from Sapporo to San Borja. Here are two standard methods:

Vincenty's formula (applied above)
  • 9878.749 miles
  • 15898.306 kilometers
  • 8584.398 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
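To reproduce that figure, the sketch below is a plain-Python implementation of the standard Vincenty inverse method on the WGS-84 ellipsoid. The function name is mine, the coordinates are the airport coordinates from the listings further down converted to decimal degrees, and small differences in the last decimals can come from the ellipsoid constants and convergence tolerance chosen.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Vincenty inverse method on the WGS-84 ellipsoid; returns miles."""
        a = 6378137.0                      # semi-major axis (metres)
        f = 1 / 298.257223563              # flattening
        b = (1 - f) * a                    # semi-minor axis (metres)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                 # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (
            cos_2sigma_m + B / 4 * (
                cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
                - B / 6 * cos_2sigma_m
                * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma) / 1609.344   # metres -> statute miles

    # CTS and SRJ in decimal degrees (converted from the DMS listings below)
    print(round(vincenty_miles(42.775, 141.691944, -14.859167, -66.7375), 1))
    # expected to land very close to the 9878.749-mile figure above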

Haversine formula
  • 9876.671 miles
  • 15894.961 kilometers
  • 8582.592 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
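A minimal haversine sketch, assuming a mean Earth radius of 3958.8 miles (the exact radius this calculator uses is not stated on the page):

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
        """Great-circle distance on a sphere via the haversine formula."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * radius_miles * math.asin(math.sqrt(a))

    # CTS and SRJ in decimal degrees (converted from the DMS listings below)
    print(round(haversine_miles(42.775, 141.691944, -14.859167, -66.7375), 1))
    # ≈ 9877 miles, in line with the haversine figure above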

How long does it take to fly from Sapporo to San Borja?

The estimated flight time from New Chitose Airport to Capitán Germán Quiroga Guardia Airport is 19 hours and 12 minutes.
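The page does not say how this duration is derived; dividing the Vincenty distance by the block time at least makes the implied average ground speed explicit:

    distance_miles = 9878.749            # Vincenty distance from above
    flight_hours = 19 + 12 / 60          # 19 h 12 min

    print(round(distance_miles / flight_hours))   # ≈ 515 mph implied average ground speed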

Flight carbon footprint between New Chitose Airport (CTS) and Capitán Germán Quiroga Guardia Airport (SRJ)

On average, flying from Sapporo to San Borja generates about 1 283 kg of CO2 per passenger, and 1 283 kilograms equals 2 828 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
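The kilogram-to-pound conversion is easy to check, and the same numbers give a rough per-mile intensity (a derived figure, not one stated on the page):

    KG_TO_LB = 2.20462                   # pounds per kilogram

    co2_kg = 1283                        # per-passenger estimate from above
    print(round(co2_kg * KG_TO_LB))      # ≈ 2829 lb; the page's 2828 lb reflects rounding in the kg figure
    print(round(co2_kg / 9878.749, 2))   # ≈ 0.13 kg of CO2 per passenger-mile (derived, not from the page)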

Map of flight path from Sapporo to San Borja

See the map of the shortest flight path between New Chitose Airport (CTS) and Capitán Germán Quiroga Guardia Airport (SRJ).

Airport information

Origin: New Chitose Airport
City: Sapporo
Country: Japan
IATA Code: CTS
ICAO Code: RJCC
Coordinates: 42°46′30″N, 141°41′31″E
Destination: Capitán Germán Quiroga Guardia Airport
City: San Borja
Country: Bolivia
IATA Code: SRJ
ICAO Code: SLSB
Coordinates: 14°51′33″S, 66°44′15″W
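
The coordinates above are listed in degrees, minutes and seconds, while the distance formulas earlier expect decimal degrees; a small helper (mine, for illustration) does the conversion:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to decimal degrees (S and W are negative)."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # Airport coordinates from the listings above
    cts = (dms_to_decimal(42, 46, 30, "N"), dms_to_decimal(141, 41, 31, "E"))   # ≈ (42.7750, 141.6919)
    srj = (dms_to_decimal(14, 51, 33, "S"), dms_to_decimal(66, 44, 15, "W"))    # ≈ (-14.8592, -66.7375)
    print(cts, srj)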