
How far is Saskatoon from Pasto?

The distance between Pasto (Antonio Nariño Airport) and Saskatoon (Saskatoon John G. Diefenbaker International Airport) is 3888 miles / 6258 kilometers / 3379 nautical miles.

Antonio Nariño Airport – Saskatoon John G. Diefenbaker International Airport

  • 3888 miles
  • 6258 kilometers
  • 3379 nautical miles


Distance from Pasto to Saskatoon

There are several ways to calculate the distance from Pasto to Saskatoon. Here are two standard methods:

Vincenty's formula (applied above)
  • 3888.320 miles
  • 6257.644 kilometers
  • 3378.857 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
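For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name `vincenty_km`, the convergence tolerance, and the iteration cap are our own choices, and the decimal coordinates are converted from the DMS values in the airport information below; it should reproduce the 6257.644-kilometer figure above to within roundoff.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

# WGS-84 ellipsoid parameters
A_AXIS = 6378137.0          # semi-major axis, meters
F = 1 / 298.257223563       # flattening
B_AXIS = A_AXIS * (1 - F)   # semi-minor axis, meters

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: ellipsoidal distance in kilometers."""
    U1 = atan((1 - F) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - F) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero when both points lie on the equator
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return B_AXIS * A * (sigma - delta_sigma) / 1000.0

# PSO -> YXE, decimal degrees from the airport information below
print(vincenty_km(1.39611, -77.29139, 52.17056, -106.69972))  # ~6257.6 km
```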

Haversine formula
  • 3896.988 miles
  • 6271.594 kilometers
  • 3386.390 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
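A haversine sketch is much shorter. Assuming the conventional mean Earth radius of 6371 km (a value that reproduces the 6271.594-kilometer figure above to within rounding):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km by default)."""
    phi1, lam1, phi2, lam2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((phi2 - phi1) / 2) ** 2 +
         cos(phi1) * cos(phi2) * sin((lam2 - lam1) / 2) ** 2)
    return 2 * r * asin(sqrt(a))

# PSO -> YXE, decimal degrees from the airport information below
print(haversine_km(1.39611, -77.29139, 52.17056, -106.69972))  # ~6271.6 km
```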

How long does it take to fly from Pasto to Saskatoon?

The estimated flight time from Antonio Nariño Airport to Saskatoon John G. Diefenbaker International Airport is 7 hours and 51 minutes.
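The exact timing model isn't stated here. A common approximation, which lands within a minute or so of the figure above, adds a fixed half-hour allowance for taxi, climb, and descent to cruise time at an assumed average speed of 850 km/h; both constants in this sketch are assumptions, not published parameters.

```python
def estimate_flight_time(distance_km, cruise_kmh=850.0, overhead_min=30.0):
    """Rough gate-to-gate estimate: a fixed taxi/climb/descent allowance
    plus cruise at an assumed average speed (both values are assumptions)."""
    minutes = overhead_min + distance_km / cruise_kmh * 60.0
    return divmod(round(minutes), 60)

hours, mins = estimate_flight_time(6257.644)
print(f"{hours} hours and {mins} minutes")  # within a minute of the 7 h 51 min above
```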

Flight carbon footprint between Antonio Nariño Airport (PSO) and Saskatoon John G. Diefenbaker International Airport (YXE)

On average, flying from Pasto to Saskatoon generates about 442 kg of CO2 per passenger, which is equivalent to roughly 975 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
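The emission model behind this number isn't published here. The sketch below simply back-solves an average factor of about 0.0706 kg of CO2 per passenger-kilometer from this route's figures (an assumption, not a documented constant) and converts kilograms to pounds:

```python
KG_CO2_PER_PAX_KM = 0.0706  # assumed factor, back-solved from this route's figures
LB_PER_KG = 2.20462         # kilograms-to-pounds conversion

def co2_estimate(distance_km):
    """Per-passenger CO2 estimate in kilograms and pounds."""
    kg = distance_km * KG_CO2_PER_PAX_KM
    return kg, kg * LB_PER_KG

kg, lb = co2_estimate(6258)
print(f"{kg:.0f} kg CO2 ~ {lb:.0f} lbs")  # roughly 442 kg / 974 lbs
```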

Map of flight path from Pasto to Saskatoon

See the map of the shortest flight path between Antonio Nariño Airport (PSO) and Saskatoon John G. Diefenbaker International Airport (YXE).

Airport information

Origin: Antonio Nariño Airport
City: Pasto
Country: Colombia
IATA Code: PSO
ICAO Code: SKPS
Coordinates: 1°23′46″N, 77°17′29″W
Destination: Saskatoon John G. Diefenbaker International Airport
City: Saskatoon
Country: Canada
IATA Code: YXE
ICAO Code: CYXE
Coordinates: 52°10′14″N, 106°41′59″W
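Both airports' coordinates are listed in degrees, minutes, and seconds. A small helper (the name `dms_to_decimal` is our own) converts them to the signed decimal degrees used by the distance formulas above:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees; south and west are negative."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (deg + minutes / 60.0 + seconds / 3600.0)

# PSO and YXE coordinates as listed above
pso = (dms_to_decimal(1, 23, 46, "N"), dms_to_decimal(77, 17, 29, "W"))
yxe = (dms_to_decimal(52, 10, 14, "N"), dms_to_decimal(106, 41, 59, "W"))
print(pso)  # (1.39611..., -77.29138...)
print(yxe)  # (52.17055..., -106.69972...)
```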