
How far is Windsor from San Borja?

The distance between San Borja (Capitán Germán Quiroga Guardia Airport) and Windsor (Windsor International Airport) is 4064 miles / 6541 kilometers / 3532 nautical miles.

Capitán Germán Quiroga Guardia Airport – Windsor International Airport

4064 miles / 6541 kilometers / 3532 nautical miles


Distance from San Borja to Windsor

There are several ways to calculate the distance from San Borja to Windsor. Here are two standard methods:

Vincenty's formula (applied above)
  • 4064.256 miles
  • 6540.785 kilometers
  • 3531.742 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
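As a rough, self-contained sketch, the Python below implements Vincenty's inverse formula on the WGS-84 ellipsoid. The page does not say which ellipsoid or implementation it uses, so the constants here are assumptions; with the airport coordinates listed further down, the result should agree with the figures above to within rounding.

import math

# WGS-84 ellipsoid constants (assumed; the page does not state its ellipsoid).
A = 6378137.0               # semi-major axis, meters
F = 1 / 298.257223563       # flattening
B = A * (1 - F)             # semi-minor axis, meters

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters between two points in decimal degrees.
    Note: coincident points divide by zero and near-antipodal points may
    fail to converge; neither applies to this route."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
        cos_sigma * (2 * cos_2sm ** 2 - 1)
        - big_b / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return B * big_a * (sigma - d_sigma)

# SRJ (14°51′33″S, 66°44′15″W) and YQG (42°16′32″N, 82°57′20″W) in decimal degrees.
srj = (-(14 + 51/60 + 33/3600), -(66 + 44/60 + 15/3600))
yqg = (42 + 16/60 + 32/3600, -(82 + 57/60 + 20/3600))
m = vincenty_inverse(*srj, *yqg)
print(f"{m/1609.344:.3f} mi, {m/1000:.3f} km, {m/1852:.3f} nmi")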

Haversine formula
  • 4080.361 miles
  • 6566.704 kilometers
  • 3545.737 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, i.e. the shortest path between two points along the sphere's surface).
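And a minimal haversine sketch for comparison. The 6,371 km mean Earth radius is an assumption (the calculator doesn't state which radius it uses), which is why the last digits can differ slightly from the figures above.

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given (assumed) radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(-14.859167, -66.737500, 42.275556, -82.955556)  # SRJ -> YQG
print(f"{km:.3f} km ({km/1.609344:.3f} mi, {km/1.852:.3f} nmi)")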

How long does it take to fly from San Borja to Windsor?

The estimated flight time from Capitán Germán Quiroga Guardia Airport to Windsor International Airport is 8 hours and 11 minutes.
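The calculator doesn't publish its flight-time formula. A back-of-envelope sketch, assuming an average block speed of about 800 km/h (roughly 500 mph) over the Vincenty distance, happens to reproduce the quoted figure; the speed is an assumption, not the site's documented method.

def estimated_flight_time(distance_km, block_speed_kmh=800.0):
    # 800 km/h is an assumed average block speed, not a published constant.
    total_min = round(distance_km / block_speed_kmh * 60)
    return f"{total_min // 60} hours and {total_min % 60} minutes"

print(estimated_flight_time(6540.785))  # -> 8 hours and 11 minutes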

Flight carbon footprint between Capitán Germán Quiroga Guardia Airport (SRJ) and Windsor International Airport (YQG)

On average, flying from San Borja to Windsor generates about 464 kg of CO2 per passenger, which is roughly 1,024 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
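A small sketch of the arithmetic. The per-passenger emission factor below is purely a hypothetical value chosen to land near the ~464 kg figure; the calculator's actual methodology is not published. The pound conversion uses the exact definition of the avoirdupois pound.

KG_PER_LB = 0.45359237  # exact kg-per-pound definition

# Hypothetical per-passenger factor (kg CO2 per km) that reproduces ~464 kg;
# NOT the site's published methodology.
ASSUMED_KG_CO2_PER_PAX_KM = 0.071

co2_kg = ASSUMED_KG_CO2_PER_PAX_KM * 6540.785
print(f"{co2_kg:.0f} kg CO2 (~{co2_kg / KG_PER_LB:.0f} lb)")
# -> 464 kg CO2 (~1024 lb); converting before rounding the kilograms
#    explains the page quoting 1,024 lb alongside 464 kg.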

Map of flight path from San Borja to Windsor

See the map of the shortest flight path between Capitán Germán Quiroga Guardia Airport (SRJ) and Windsor International Airport (YQG).

Airport information

Origin: Capitán Germán Quiroga Guardia Airport
City: San Borja
Country: Bolivia
IATA Code: SRJ
ICAO Code: SLSB
Coordinates: 14°51′33″S, 66°44′15″W
Destination: Windsor International Airport
City: Windsor
Country: Canada
IATA Code: YQG
ICAO Code: CYQG
Coordinates: 42°16′32″N, 82°57′20″W