How far is Saskatoon from Aberdeen?

The distance between Aberdeen (Aberdeen Airport) and Saskatoon (Saskatoon John G. Diefenbaker International Airport) is 3778 miles / 6080 kilometers / 3283 nautical miles.


Distance from Aberdeen to Saskatoon

There are several ways to calculate the distance from Aberdeen to Saskatoon. Here are two standard methods:

Vincenty's formula (applied above)
  • 3777.916 miles
  • 6079.967 kilometers
  • 3282.919 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
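To make the calculation concrete, here is a minimal, self-contained Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and the decimal coordinates (converted from the DMS values in the Airport information section below) are illustrative choices, not something published by the calculator itself.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid, result in statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344   # metres -> statute miles

abz = (57.2017, -2.1978)     # 57°12′6″N, 2°11′52″W
yxe = (52.1706, -106.6997)   # 52°10′14″N, 106°41′59″W
print(round(vincenty_miles(*abz, *yxe), 1))   # ≈ 3777.9 miles
```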

Haversine formula
  • 3765.429 miles
  • 6059.871 kilometers
  • 3272.068 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
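The spherical counterpart can be sketched in a few lines. Note that the mean earth radius of 6371 km used here is an assumption (the calculator does not state which radius it uses), so the last decimal place may differ slightly from the figures above.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical earth, in statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    c = 2 * math.asin(math.sqrt(a))            # central angle in radians
    return radius_km * c / 1.609344            # kilometres -> statute miles

abz = (57.2017, -2.1978)     # 57°12′6″N, 2°11′52″W
yxe = (52.1706, -106.6997)   # 52°10′14″N, 106°41′59″W
print(round(haversine_miles(*abz, *yxe), 1))   # ≈ 3765 miles
```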

How long does it take to fly from Aberdeen to Saskatoon?

The estimated flight time from Aberdeen Airport to Saskatoon John G. Diefenbaker International Airport is 7 hours and 39 minutes.
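The calculator does not publish its timing model, but an estimate like this typically comes from dividing the route distance by an assumed average block speed. The speed used in the sketch below is an assumption chosen for illustration, not the site's actual parameter.

```python
# Minimal sketch of a distance-based flight-time estimate (assumed speed, not the site's model)
distance_miles = 3777.916          # Vincenty distance from above
avg_speed_mph = 494                # assumed average block speed

hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"Estimated flight time: {h} h {m} min")   # ≈ 7 h 39 min
```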

Flight carbon footprint between Aberdeen Airport (ABZ) and Saskatoon John G. Diefenbaker International Airport (YXE)

On average, flying from Aberdeen to Saskatoon generates about 429 kg (945 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
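The kilogram-to-pound conversion behind that figure is a quick check (using the 429 kg estimate quoted above):

```python
co2_kg = 429                      # estimated CO2 per passenger, from the figure above
co2_lb = co2_kg * 2.20462         # kilograms -> pounds
print(f"{co2_kg} kg ≈ {co2_lb:.1f} lb")   # ≈ 945.8 lb, reported above as 945 lb
```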

Map of flight path from Aberdeen to Saskatoon

See the map of the shortest flight path between Aberdeen Airport (ABZ) and Saskatoon John G. Diefenbaker International Airport (YXE).

Airport information

Origin: Aberdeen Airport
City: Aberdeen
Country: United Kingdom
IATA Code: ABZ
ICAO Code: EGPD
Coordinates: 57°12′6″N, 2°11′52″W
Destination: Saskatoon John G. Diefenbaker International Airport
City: Saskatoon
Country: Canada
IATA Code: YXE
ICAO Code: CYXE
Coordinates: 52°10′14″N, 106°41′59″W