
How far is Kingston from San Borja?

The distance between San Borja (Capitán Germán Quiroga Guardia Airport) and Kingston (Kingston Norman Rogers Airport) is 4113 miles / 6619 kilometers / 3574 nautical miles.

Capitán Germán Quiroga Guardia Airport – Kingston Norman Rogers Airport

4113 miles / 6619 kilometers / 3574 nautical miles


Distance from San Borja to Kingston

There are several ways to calculate the distance from San Borja to Kingston. Here are two standard methods:

Vincenty's formula (applied above)
  • 4112.553 miles
  • 6618.513 kilometers
  • 3573.711 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
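
Vincenty's method solves the inverse geodesic problem on an ellipsoid iteratively. It isn't in the Python standard library, but the geographiclib package implements Karney's algorithm, which supersedes Vincenty and agrees with it to well under a millimeter on WGS-84. A minimal sketch, assuming the WGS-84 ellipsoid and the coordinates from the Airport information section below:

    from geographiclib.geodesic import Geodesic

    # SRJ and YGK in decimal degrees (converted from the DMS values below)
    SRJ = (-14.859167, -66.737500)   # 14°51′33″S, 66°44′15″W
    YGK = (44.225278, -76.596667)    # 44°13′31″N, 76°35′48″W

    # Solve the inverse geodesic problem on the WGS-84 ellipsoid
    meters = Geodesic.WGS84.Inverse(SRJ[0], SRJ[1], YGK[0], YGK[1])["s12"]

    print(f"{meters / 1609.344:.3f} miles")        # ≈ 4112.5
    print(f"{meters / 1000:.3f} kilometers")       # ≈ 6618.5
    print(f"{meters / 1852:.3f} nautical miles")   # ≈ 3573.7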

Haversine formula
  • 4129.387 miles
  • 6645.604 kilometers
  • 3588.339 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface of the sphere.
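
The haversine formula is simple enough to implement directly. A minimal sketch in Python; the mean Earth radius of 6371.0088 km is an assumption (the calculator does not state which radius it uses), so the last decimals may differ slightly from the figures above:

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0088):
        """Great-circle distance between two points given in decimal degrees."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * radius_km * asin(sqrt(a))

    # SRJ (14°51′33″S, 66°44′15″W) to YGK (44°13′31″N, 76°35′48″W)
    km = haversine_km(-14.859167, -66.737500, 44.225278, -76.596667)
    print(f"{km:.1f} km, {km / 1.609344:.1f} mi, {km / 1.852:.1f} NM")
    # ≈ 6646 km / 4130 mi / 3589 NM, matching the haversine figures above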

How long does it take to fly from San Borja to Kingston?

The estimated flight time from Capitán Germán Quiroga Guardia Airport to Kingston Norman Rogers Airport is 8 hours and 17 minutes.
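
The calculator does not publish its formula, but a common rule of thumb reproduces the figure: a fixed half-hour allowance for taxi, takeoff, climb, and descent, plus cruise at roughly 850 km/h. Both constants in the sketch below are assumptions, not the site's documented method:

    def flight_time(distance_km, cruise_kmh=850, overhead_h=0.5):
        """Rule-of-thumb block time: fixed overhead plus cruise at constant speed."""
        total_min = round((overhead_h + distance_km / cruise_kmh) * 60)
        h, m = divmod(total_min, 60)
        return f"{h} h {m} min"

    print(flight_time(6618.513))  # "8 h 17 min", matching the estimate above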

Flight carbon footprint between Capitán Germán Quiroga Guardia Airport (SRJ) and Kingston Norman Rogers Airport (YGK)

On average, flying from San Borja to Kingston generates about 470 kg of CO2 per passenger, which is equal to 1,037 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
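
As a quick check on the conversion, and to relate the figure back to fuel burned (the standard ICAO factor is roughly 3.16 kg of CO2 per kg of jet fuel; the per-passenger fuel estimate below is an inference, not a figure from the calculator):

    CO2_KG = 470
    KG_PER_LB = 0.45359237        # exact definition of the pound
    CO2_PER_KG_FUEL = 3.16        # standard ICAO jet-fuel emission factor

    print(f"{CO2_KG / KG_PER_LB:.0f} lb")                 # ≈ 1036 lb
    print(f"{CO2_KG / CO2_PER_KG_FUEL:.0f} kg of fuel")   # ≈ 149 kg per passenger

The rounded 470 kg converts to about 1,036 lb; the page's 1,037 lb presumably comes from an unrounded kilogram value.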

Map of flight path from San Borja to Kingston

See the map of the shortest flight path between Capitán Germán Quiroga Guardia Airport (SRJ) and Kingston Norman Rogers Airport (YGK).
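
The shortest flight path drawn on such a map is the great circle between the two airports. One standard way to generate it for plotting is spherical linear interpolation (slerp) between the endpoints; a minimal sketch, with the 101-point resolution chosen arbitrarily:

    from math import radians, degrees, sin, cos, asin, atan2, sqrt

    def great_circle_points(lat1, lon1, lat2, lon2, n=100):
        """Return n + 1 points along the great circle between two lat/lon pairs."""
        def to_xyz(lat, lon):
            la, lo = radians(lat), radians(lon)
            return cos(la) * cos(lo), cos(la) * sin(lo), sin(la)
        ax, ay, az = to_xyz(lat1, lon1)
        bx, by, bz = to_xyz(lat2, lon2)
        # Central angle between the endpoints, via atan2(|A x B|, A . B)
        cross = sqrt((ay*bz - az*by)**2 + (az*bx - ax*bz)**2 + (ax*by - ay*bx)**2)
        omega = atan2(cross, ax*bx + ay*by + az*bz)
        points = []
        for i in range(n + 1):
            f = i / n
            # Slerp keeps every interpolated vector on the unit sphere
            s, t = sin((1 - f) * omega) / sin(omega), sin(f * omega) / sin(omega)
            x, y, z = s*ax + t*bx, s*ay + t*by, s*az + t*bz
            points.append((degrees(asin(z)), degrees(atan2(y, x))))
        return points

    path = great_circle_points(-14.859167, -66.7375, 44.225278, -76.596667)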

Airport information

Origin: Capitán Germán Quiroga Guardia Airport
City: San Borja
Country: Bolivia
IATA Code: SRJ
ICAO Code: SLSB
Coordinates: 14°51′33″S, 66°44′15″W
Destination: Kingston Norman Rogers Airport
City: Kingston
Country: Canada
IATA Code: YGK
ICAO Code: CYGK
Coordinates: 44°13′31″N, 76°35′48″W