How far is San Borja from Puerto Inírida?
The distance between Puerto Inírida (César Gaviria Trujillo Airport) and San Borja (Capitán Germán Quiroga Guardia Airport) is 1288 miles / 2074 kilometers / 1120 nautical miles.
César Gaviria Trujillo Airport – Capitán Germán Quiroga Guardia Airport
Distance from Puerto Inírida to San Borja
There are several ways to calculate the distance from Puerto Inírida to San Borja. Here are two standard methods:
Vincenty's formula (applied above)
- 1288.431 miles
- 2073.529 kilometers
- 1119.616 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
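As a minimal sketch, the ellipsoidal figure can be reproduced in Python with the geographiclib library, which solves the same WGS-84 inverse geodesic problem that Vincenty's formula addresses (via Karney's algorithm, which converges for all point pairs). The decimal coordinates are converted from the airport tables below.

```python
# Ellipsoidal (WGS-84) distance between the two airports.
# geographiclib solves the same inverse geodesic problem as Vincenty's
# formula, using Karney's algorithm.
from geographiclib.geodesic import Geodesic

# Airport coordinates from the tables below, in decimal degrees.
PDA = (3.853333, -67.906111)    # César Gaviria Trujillo Airport
SRJ = (-14.859167, -66.737500)  # Capitán Germán Quiroga Guardia Airport

result = Geodesic.WGS84.Inverse(PDA[0], PDA[1], SRJ[0], SRJ[1])
meters = result["s12"]  # geodesic distance along the ellipsoid, in meters

print(f"{meters / 1000:.3f} km")           # close to the 2073.529 km above
print(f"{meters / 1609.344:.3f} miles")    # ~1288.4 miles
print(f"{meters / 1852:.3f} nautical miles")
```

Small differences from the quoted figures come from rounding the coordinates to whole arc-seconds.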
Haversine formula
- 1295.400 miles
- 2084.744 kilometers
- 1125.671 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
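For comparison, the great-circle figure takes only a few lines of self-contained Python; the sketch below assumes a mean earth radius of 6,371 km.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.asin(math.sqrt(a))

# Same airport coordinates as above, in decimal degrees.
km = haversine_km(3.853333, -67.906111, -14.859167, -66.737500)
print(f"{km:.1f} km")  # ~2084.8 km, matching the figure above to within
                       # the rounding of the coordinates
```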
How long does it take to fly from Puerto Inírida to San Borja?
The estimated flight time from César Gaviria Trujillo Airport to Capitán Germán Quiroga Guardia Airport is 2 hours and 56 minutes.
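Estimates like this are typically derived from the great-circle distance and an assumed average speed. The sketch below uses an average block speed of 440 mph, an assumption chosen for illustration (it happens to reproduce the quoted figure), not the site's actual model.

```python
# Rough flight-time estimate: distance divided by an assumed average
# block speed. 440 mph is an illustrative assumption, not a known input.
distance_miles = 1288.431
avg_speed_mph = 440  # assumed average speed including climb and descent

hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")  # -> 2 h 56 min
```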
What is the time difference between Puerto Inírida and San Borja?
San Borja is 1 hour ahead of Puerto Inírida: Colombia (Puerto Inírida) is on UTC−5 and Bolivia (San Borja) is on UTC−4, and neither country observes daylight saving time.
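The offset can be confirmed with Python's standard zoneinfo module, using the IANA time zone names for the two countries:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

now = datetime.now(ZoneInfo("UTC"))
bogota = now.astimezone(ZoneInfo("America/Bogota"))  # Puerto Inírida, UTC-5
la_paz = now.astimezone(ZoneInfo("America/La_Paz"))  # San Borja, UTC-4
print(la_paz.utcoffset() - bogota.utcoffset())       # 1:00:00
```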
Flight carbon footprint between César Gaviria Trujillo Airport (PDA) and Capitán Germán Quiroga Guardia Airport (SRJ)
On average, flying from Puerto Inírida to San Borja generates about 166 kg (366 lb) of CO2 per passenger. These figures are estimates and include only the CO2 produced by burning jet fuel.
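Estimates of this kind usually multiply the distance by a per-passenger fuel-burn rate and by the standard combustion factor of about 3.16 kg of CO2 per kg of jet fuel. The fuel-burn rate in the sketch below is an assumed typical value, not the site's actual parameter.

```python
# Hedged sketch of a per-passenger CO2 estimate. 3.16 kg CO2 per kg of
# jet fuel is the standard combustion factor; the ~25 g of fuel per
# passenger-km is an assumed typical value, not the site's model.
distance_km = 2073.529
fuel_burn_kg_per_pax_km = 0.025  # assumption: typical per-passenger burn
co2_per_kg_fuel = 3.16           # kg CO2 emitted per kg of jet fuel

co2_kg = distance_km * fuel_burn_kg_per_pax_km * co2_per_kg_fuel
print(f"{co2_kg:.0f} kg CO2 per passenger")  # ~164 kg, close to the 166 kg above
```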
Map of flight path from Puerto Inírida to San Borja
See the map of the shortest flight path between César Gaviria Trujillo Airport (PDA) and Capitán Germán Quiroga Guardia Airport (SRJ).
Airport information
| Origin | César Gaviria Trujillo Airport |
| --- | --- |
| City: | Puerto Inírida |
| Country: | Colombia |
| IATA Code: | PDA |
| ICAO Code: | SKPD |
| Coordinates: | 3°51′12″N, 67°54′22″W |
| Destination | Capitán Germán Quiroga Guardia Airport |
| --- | --- |
| City: | San Borja |
| Country: | Bolivia |
| IATA Code: | SRJ |
| ICAO Code: | SLSB |
| Coordinates: | 14°51′33″S, 66°44′15″W |
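The distance examples above use these coordinates in decimal degrees; a small helper makes the conversion explicit (the function name is illustrative, not from any particular library).

```python
# Converts the tables' degree/minute/second coordinates into the decimal
# degrees used by the distance examples above.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(3, 51, 12, "N"))   # PDA latitude:    3.853333...
print(dms_to_decimal(67, 54, 22, "W"))  # PDA longitude: -67.906111...
print(dms_to_decimal(14, 51, 33, "S"))  # SRJ latitude:  -14.859167...
print(dms_to_decimal(66, 44, 15, "W"))  # SRJ longitude: -66.7375
```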