How far is London from San Borja?
The distance between San Borja (Capitán Germán Quiroga Guardia Airport) and London (London International Airport) is 4087 miles / 6577 kilometers / 3551 nautical miles.
Capitán Germán Quiroga Guardia Airport – London International Airport
Distance from San Borja to London
There are several ways to calculate the distance from San Borja to London. Here are two standard methods:
Vincenty's formula (applied above)
- 4086.917 miles
- 6577.256 kilometers
- 3551.434 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
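For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates are converted from the tables further down this page; the iteration cap and convergence tolerance are illustrative choices, not values stated by the page.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iteration cap is illustrative
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0                        # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev, lam = lam, L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:        # convergence tolerance, illustrative
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)

# SRJ and YXU coordinates from the airport tables below (DMS converted to decimal)
srj = (-14.859167, -66.737500)   # 14°51′33″S, 66°44′15″W
yxu = (43.035556, -81.153889)    # 43°2′8″N, 81°9′14″W
m = vincenty_distance_m(*srj, *yxu)
print(f"{m / 1609.344:.3f} mi / {m / 1000:.3f} km / {m / 1852:.3f} nmi")
```

The output should land within a mile or so of the 4086.917-mile figure above, depending on how the coordinates are rounded.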
Haversine formula
- 4103.284 miles
- 6603.595 kilometers
- 3565.656 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance (the shortest path between two points on the sphere).
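A spherical model makes the code much shorter. This sketch assumes the conventional 6,371 km mean earth radius; the page does not say which radius its own calculation uses, so small discrepancies from the 6603.595 km figure are possible.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# SRJ -> YXU, using the same decimal coordinates as above
print(f"{haversine_km(-14.859167, -66.7375, 43.035556, -81.153889):.3f} km")
```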
How long does it take to fly from San Borja to London?
The estimated flight time from Capitán Germán Quiroga Guardia Airport to London International Airport is 8 hours and 14 minutes.
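The page does not state the model behind this estimate. A common rule of thumb is distance divided by an assumed average block speed; the 500 mph used below is an assumption for illustration, and it lands within a few minutes of the quoted time.

```python
def estimate_flight_time(distance_miles, avg_speed_mph=500.0):
    # avg_speed_mph is an assumed average block speed, not a value from the page
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours and {m} minutes"

print(estimate_flight_time(4086.917))  # -> 8 hours and 10 minutes at the assumed speed
```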
What is the time difference between San Borja and London?
The time difference between San Borja and London is 1 hour. London is 1 hour behind San Borja.
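This can be checked with Python's standard zoneinfo module. San Borja falls in America/La_Paz (UTC-4 year-round); London, Ontario falls in America/Toronto (UTC-5, or UTC-4 during daylight saving time), so the 1-hour difference applies while Ontario is on standard time.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

def offset_difference_hours(when):
    # Positive result means London (Ontario) is behind San Borja
    la_paz = when.astimezone(ZoneInfo("America/La_Paz")).utcoffset()
    toronto = when.astimezone(ZoneInfo("America/Toronto")).utcoffset()
    return (la_paz - toronto).total_seconds() / 3600

print(offset_difference_hours(datetime(2024, 1, 15, tzinfo=timezone.utc)))  # 1.0
print(offset_difference_hours(datetime(2024, 7, 15, tzinfo=timezone.utc)))  # 0.0 (Ontario on DST)
```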
Flight carbon footprint between Capitán Germán Quiroga Guardia Airport (SRJ) and London International Airport (YXU)
On average, flying from San Borja to London generates about 467 kg of CO2 per passenger; 467 kilograms equals 1,030 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
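Back-solving from the page's own numbers implies a flat emission factor of about 0.071 kg of CO2 per passenger-kilometre (467 kg over 6,577 km). The sketch below applies that constant factor; real calculators vary the factor with aircraft type and stage length, so treat this as a rough reconstruction rather than the page's actual method.

```python
KG_PER_PAX_KM = 467 / 6577.256  # ~0.071, back-solved from the page's own figures

def estimate_co2(distance_km):
    """Return (kilograms, pounds) of CO2 per passenger at the implied factor."""
    kg = distance_km * KG_PER_PAX_KM
    return kg, kg * 2.20462  # 1 kg = 2.20462 lbs

kg, lbs = estimate_co2(6577.256)
print(f"{kg:.0f} kg = {lbs:.0f} lbs")  # -> 467 kg = 1030 lbs
```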
Map of flight path from San Borja to London
See the map of the shortest flight path between Capitán Germán Quiroga Guardia Airport (SRJ) and London International Airport (YXU).
Airport information
| Origin | Capitán Germán Quiroga Guardia Airport |
|---|---|
| City: | San Borja |
| Country: | Bolivia |
| IATA Code: | SRJ |
| ICAO Code: | SLSB |
| Coordinates: | 14°51′33″S, 66°44′15″W |
| Destination | London International Airport |
|---|---|
| City: | London |
| Country: | Canada |
| IATA Code: | YXU |
| ICAO Code: | CYXU |
| Coordinates: | 43°2′8″N, 81°9′14″W |