
How far is Jujuy from Saskatoon?

The distance between Saskatoon (Saskatoon John G. Diefenbaker International Airport) and Jujuy (Gobernador Horacio Guzmán International Airport) is 5839 miles / 9398 kilometers / 5074 nautical miles.

Saskatoon John G. Diefenbaker International Airport – Gobernador Horacio Guzmán International Airport
  • 5839 miles
  • 9398 kilometers
  • 5074 nautical miles


Distance from Saskatoon to Jujuy

There are several ways to calculate the distance from Saskatoon to Jujuy. Here are two standard methods:

Vincenty's formula (applied above)
  • 5839.345 miles
  • 9397.515 kilometers
  • 5074.252 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
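For reference, the sketch below is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The choice of ellipsoid, the function name vincenty_miles, and the convergence tolerance are assumptions (the page does not state them); the coordinates are taken from the airport information further down. The sketch has no special handling for coincident, equatorial, or near-antipodal points, where the iteration can fail.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        # WGS-84 ellipsoid parameters (assumed)
        a = 6378137.0             # semi-major axis, metres
        f = 1 / 298.257223563     # flattening
        b = (1 - f) * a           # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos_sq_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
            C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
        B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
        delta_sigma = B * sin_sigma * (
            cos_2sigma_m + B / 4 * (
                cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
                - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
        metres = b * A * (sigma - delta_sigma)
        return metres / 1609.344  # statute miles

    # YXE (52°10′14″N, 106°41′59″W) to JUJ (24°23′34″S, 65°5′52″W)
    print(round(vincenty_miles(52.17056, -106.69972, -24.39278, -65.09778), 1))
    # ≈ 5839 miles, matching the figure quoted above to within rounding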

Haversine formula
  • 5855.749 miles
  • 9423.915 kilometers
  • 5088.507 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
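For comparison, here is a short Python sketch of the haversine (great-circle) calculation. The 6,371 km mean earth radius is an assumption, since the page does not state which radius it uses.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # great-circle distance between two latitude/longitude points on a spherical earth
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
        km = 2 * radius_km * math.asin(math.sqrt(a))
        return km / 1.609344  # statute miles

    # YXE (52°10′14″N, 106°41′59″W) to JUJ (24°23′34″S, 65°5′52″W)
    print(round(haversine_miles(52.17056, -106.69972, -24.39278, -65.09778), 1))
    # ≈ 5856 miles, close to the 5855.749-mile figure above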

How long does it take to fly from Saskatoon to Jujuy?

The estimated flight time from Saskatoon John G. Diefenbaker International Airport to Gobernador Horacio Guzmán International Airport is 11 hours and 33 minutes.
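The page does not say how this estimate is derived. A common rough approach is to divide the distance by an assumed average gate-to-gate speed; the sketch below, using an assumed average of about 505 mph, lands within a minute of the figure above.

    def estimate_block_time(distance_miles, avg_speed_mph=505):
        # crude gate-to-gate estimate at an assumed average block speed
        hours = distance_miles / avg_speed_mph
        h = int(hours)
        m = round((hours - h) * 60)
        return f"{h} h {m:02d} min"

    print(estimate_block_time(5839.345))  # ≈ 11 h 34 min with these assumptions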

Flight carbon footprint between Saskatoon John G. Diefenbaker International Airport (YXE) and Gobernador Horacio Guzmán International Airport (JUJ)

On average, flying from Saskatoon to Jujuy generates about 695 kg of CO2 per passenger, which is roughly 1,532 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
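For the arithmetic behind these numbers, here is a small sketch of the kilogram-to-pound conversion and the per-mile rate implied by the page's figures. The 695 kg estimate comes from the page; the per-mile rate is simply derived from it and is not a stated parameter of the site's model.

    KG_PER_LB = 0.45359237               # exact definition of the pound in kilograms

    co2_kg = 695                         # per-passenger estimate quoted above
    print(round(co2_kg / KG_PER_LB))     # 1532 lbs

    # per-mile rate implied by the figures above (illustrative only)
    print(round(co2_kg / 5839.345, 3))   # ≈ 0.119 kg CO2 per passenger-mile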

Map of flight path from Saskatoon to Jujuy

See the map of the shortest flight path between Saskatoon John G. Diefenbaker International Airport (YXE) and Gobernador Horacio Guzmán International Airport (JUJ).

Airport information

Origin: Saskatoon John G. Diefenbaker International Airport
City: Saskatoon
Country: Canada
IATA Code: YXE
ICAO Code: CYXE
Coordinates: 52°10′14″N, 106°41′59″W
Destination: Gobernador Horacio Guzmán International Airport
City: Jujuy
Country: Argentina
IATA Code: JUJ
ICAO Code: SASJ
Coordinates: 24°23′34″S, 65°5′52″W