How far is San Borja from Tampa, FL?
The distance between Tampa (Tampa International Airport) and San Borja (Capitán Germán Quiroga Guardia Airport) is 3129 miles / 5036 kilometers / 2719 nautical miles.
Tampa International Airport – Capitán Germán Quiroga Guardia Airport
Distance from Tampa to San Borja
There are several ways to calculate the distance from Tampa to San Borja. Here are two standard methods:
Vincenty's formula (applied above)
- 3129.280 miles
- 5036.088 kilometers
- 2719.270 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
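Vincenty's inverse method is iterative; a self-contained Python sketch using the standard WGS-84 ellipsoid constants is shown below. The TPA/SRJ coordinates are the airport coordinates from the tables further down, converted to decimal degrees; this is an illustrative implementation, not the site's own code.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, iterations=200, tol=1e-12):
    """Ellipsoidal distance (Vincenty inverse formula) in statute miles."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(iterations):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    meters = b * A * (sigma - deltaSigma)
    return meters / 1609.344  # meters -> statute miles

# TPA (27°58′31″N, 82°31′59″W) and SRJ (14°51′33″S, 66°44′15″W) in decimal degrees
print(round(vincenty_miles(27.9753, -82.5331, -14.8592, -66.7375), 1))
```

The result should land close to the 3129.28 miles quoted above; small differences come from rounding the coordinates.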
Haversine formula
- 3142.778 miles
- 5057.812 kilometers
- 2731.000 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
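The haversine calculation is short enough to write out directly. The sketch below assumes a mean earth radius of 3958.8 miles and uses the airport coordinates from the tables further down, converted to decimal degrees.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points,
    assuming a spherical earth of radius 3958.8 miles."""
    r = 3958.8  # mean earth radius in miles (assumption)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# TPA (27°58′31″N, 82°31′59″W) and SRJ (14°51′33″S, 66°44′15″W) in decimal degrees
print(round(haversine_miles(27.9753, -82.5331, -14.8592, -66.7375), 1))
```

This reproduces the ~3142.8-mile figure above to within a fraction of a mile; the exact value depends on the assumed earth radius.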
How long does it take to fly from Tampa to San Borja?
The estimated flight time from Tampa International Airport to Capitán Germán Quiroga Guardia Airport is 6 hours and 25 minutes.
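A flight-time estimate of this kind is typically built from the distance, an assumed average cruise speed, and a fixed allowance for takeoff and landing. The site's exact model is not stated; the sketch below uses an assumed 500 mph cruise and a 30-minute overhead, so it lands in the same ballpark rather than matching the quoted 6 h 25 min exactly.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed takeoff/landing overhead plus cruise time.
    cruise_mph and overhead_min are assumptions, not the site's published model."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(3129.280))
```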
What is the time difference between Tampa and San Borja?
The time difference between Tampa and San Borja is 1 hour: San Borja is 1 hour ahead of Tampa.
Flight carbon footprint between Tampa International Airport (TPA) and Capitán Germán Quiroga Guardia Airport (SRJ)
On average, flying from Tampa to San Borja generates about 350 kg of CO2 per passenger, which is roughly 771 pounds (lb). These figures are estimates and include only the CO2 generated by burning jet fuel.
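The kilogram-to-pound conversion above uses the standard factor 1 kg ≈ 2.20462 lb:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 350  # estimated CO2 per passenger from the text
co2_lb = co2_kg * KG_TO_LB
print(f"{co2_lb:.1f} lb")
```

The exact product is about 771.6 lb, which the page rounds down to 771 lb.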
Map of flight path from Tampa to San Borja
See the map of the shortest flight path between Tampa International Airport (TPA) and Capitán Germán Quiroga Guardia Airport (SRJ).
Airport information
| Origin | Tampa International Airport |
| --- | --- |
| City | Tampa, FL |
| Country | United States |
| IATA Code | TPA |
| ICAO Code | KTPA |
| Coordinates | 27°58′31″N, 82°31′59″W |
| Destination | Capitán Germán Quiroga Guardia Airport |
| --- | --- |
| City | San Borja |
| Country | Bolivia |
| IATA Code | SRJ |
| ICAO Code | SLSB |
| Coordinates | 14°51′33″S, 66°44′15″W |
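The coordinates in these tables are given in degrees, minutes, and seconds, while distance formulas expect signed decimal degrees. A small conversion helper, applied to the two airports above:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees (south and west are negative)."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# TPA: 27°58′31″N, 82°31′59″W
tpa = (dms_to_decimal(27, 58, 31, "N"), dms_to_decimal(82, 31, 59, "W"))
# SRJ: 14°51′33″S, 66°44′15″W
srj = (dms_to_decimal(14, 51, 33, "S"), dms_to_decimal(66, 44, 15, "W"))
print(tpa, srj)
```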