
How far is San Borja from Pontes e Lacerda?

The distance between Pontes e Lacerda (Pontes e Lacerda Airport) and San Borja (Capitán Germán Quiroga Guardia Airport) is 492 miles / 792 kilometers / 427 nautical miles.

The driving distance from Pontes e Lacerda (LCB) to San Borja (SRJ) is 719 miles / 1157 kilometers, and travel time by car is about 23 hours 16 minutes.

Pontes e Lacerda Airport – Capitán Germán Quiroga Guardia Airport

492 miles / 792 kilometers / 427 nautical miles


Distance from Pontes e Lacerda to San Borja

There are several ways to calculate the distance from Pontes e Lacerda to San Borja. Here are two standard methods:

Vincenty's formula (applied above)
  • 491.826 miles
  • 791.516 kilometers
  • 427.385 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
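As a rough illustration (not the calculator's own code), an ellipsoidal distance like the one above can be computed with the geopy library. Its geodesic function uses an ellipsoidal Earth model (Karney's algorithm rather than Vincenty's formula itself, but the two agree to well under a mile at this distance). The coordinates below are the airport coordinates listed further down this page, converted to decimal degrees.

```python
# Sketch: ellipsoidal (WGS-84) distance between LCB and SRJ using geopy.
from geopy.distance import geodesic

lcb = (-15.1933, -59.3847)   # Pontes e Lacerda Airport (15°11′36″S, 59°23′5″W)
srj = (-14.8592, -66.7375)   # Capitán Germán Quiroga Guardia Airport (14°51′33″S, 66°44′15″W)

d = geodesic(lcb, srj)
print(f"{d.miles:.1f} mi / {d.kilometers:.1f} km / {d.nautical:.1f} NM")
# Expected to land close to the figures above: ~492 mi / ~792 km / ~427 NM
```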

Haversine formula
  • 491.172 miles
  • 790.465 kilometers
  • 426.817 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
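A minimal haversine implementation in Python, using the same decimal coordinates, looks like the sketch below. The Earth radius of 6371 km is the usual mean-radius convention; a slightly different radius shifts the result slightly.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(-15.1933, -59.3847, -14.8592, -66.7375)  # LCB -> SRJ
print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km / 1.852:.1f} NM")
# Roughly 790 km / 491 mi / 427 NM, matching the haversine figures above.
```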

How long does it take to fly from Pontes e Lacerda to San Borja?

The estimated flight time from Pontes e Lacerda Airport to Capitán Germán Quiroga Guardia Airport is 1 hour and 25 minutes.
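The calculator does not publish its exact formula, but flight-time estimates of this kind are typically derived from the distance, an assumed average cruise speed, and a fixed allowance for taxi, climb, and descent. A sketch under those assumptions follows; the 500 mph speed and 30-minute allowance are illustrative, not the site's actual parameters.

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed taxi/climb/descent allowance plus cruise time."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimate_flight_minutes(492)
print(f"~{int(minutes // 60)} h {int(minutes % 60)} min")
# ~1 h 29 min, in the same ballpark as the 1 hour 25 minutes quoted above.
```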

What is the time difference between Pontes e Lacerda and San Borja?

There is no time difference between Pontes e Lacerda and San Borja.

Flight carbon footprint between Pontes e Lacerda Airport (LCB) and Capitán Germán Quiroga Guardia Airport (SRJ)

On average, flying from Pontes e Lacerda to San Borja generates about 97 kg of CO2 per passenger, which is equivalent to roughly 214 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
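Per-passenger figures like this are usually obtained by multiplying the flight distance by an emissions factor. The factor in the sketch below (about 0.12 kg of CO2 per passenger-kilometer) is an illustrative assumption for a short regional flight, not the site's published methodology.

```python
def co2_per_passenger_kg(distance_km, kg_per_pax_km=0.12):
    """Rough per-passenger CO2 estimate from flight distance and an emissions factor."""
    return distance_km * kg_per_pax_km

kg = co2_per_passenger_kg(792)
print(f"~{kg:.0f} kg CO2 (~{kg * 2.20462:.0f} lb) per passenger")
# ~95 kg, close to the 97 kg quoted above.
```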

Map of flight path and driving directions from Pontes e Lacerda to San Borja

See the map of the shortest flight path between Pontes e Lacerda Airport (LCB) and Capitán Germán Quiroga Guardia Airport (SRJ).

Airport information

Origin Pontes e Lacerda Airport
City: Pontes e Lacerda
Country: Brazil
IATA Code: LCB
ICAO Code: SWBG
Coordinates: 15°11′36″S, 59°23′5″W
Destination Capitán Germán Quiroga Guardia Airport
City: San Borja
Country: Bolivia
IATA Code: SRJ
ICAO Code: SLSB
Coordinates: 14°51′33″S, 66°44′15″W