How far is Saskatoon from Stockholm?
The distance between Stockholm (Stockholm Arlanda Airport) and Saskatoon (Saskatoon John G. Diefenbaker International Airport) is 4134 miles / 6653 kilometers / 3592 nautical miles.
Stockholm Arlanda Airport – Saskatoon John G. Diefenbaker International Airport
Distance from Stockholm to Saskatoon
There are several ways to calculate the distance from Stockholm to Saskatoon. Here are two standard methods:
Vincenty's formula (applied above)
- 4133.929 miles
- 6652.914 kilometers
- 3592.286 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 4120.106 miles
- 6630.668 kilometers
- 3580.274 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
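The haversine method above can be sketched in a few lines of Python. The airport coordinates are the ones listed in the tables below, converted to decimal degrees; the 6,371 km mean Earth radius is a standard assumption, which is why the result differs slightly from the ellipsoidal Vincenty figure:

```python
from math import radians, sin, cos, atan2, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return radius_km * 2 * atan2(sqrt(a), sqrt(1 - a))

# ARN (59°39′6″N, 17°55′6″E) to YXE (52°10′14″N, 106°41′59″W)
dist = haversine_km(59.6517, 17.9183, 52.1706, -106.6997)
print(f"{dist:.0f} km")  # close to the haversine figure quoted above
```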
How long does it take to fly from Stockholm to Saskatoon?
The estimated flight time from Stockholm Arlanda Airport to Saskatoon John G. Diefenbaker International Airport is 8 hours and 19 minutes.
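A first-order flight-time estimate simply divides the great-circle distance by an average ground speed. The ~500 mph cruise speed below is an illustrative assumption, not a figure from this page:

```python
def estimate_flight_time(distance_miles, cruise_mph=500):
    """Rough airborne-time estimate: distance over an assumed average speed."""
    hours = distance_miles / cruise_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

h, m = estimate_flight_time(4134)
print(f"about {h} h {m} min")  # about 8 h 16 min at an assumed 500 mph
```

The 8 hours 19 minutes quoted above corresponds to an average of roughly 497 mph over the 4,134-mile route, so the assumed speed is in the right range.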
What is the time difference between Stockholm and Saskatoon?
Stockholm is 7 hours ahead of Saskatoon in winter and 8 hours ahead in summer: Saskatchewan stays on Central Standard Time (UTC−6) year-round, while Sweden observes daylight saving time (UTC+1 in winter, UTC+2 in summer).
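As a sketch, the offset between the two cities can be checked with Python's standard `zoneinfo` module; Saskatoon falls in the IANA `America/Regina` zone, which does not observe daylight saving time:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def hours_ahead(when):
    """Hours by which Stockholm leads Saskatoon at the given naive datetime."""
    sto = when.replace(tzinfo=ZoneInfo("Europe/Stockholm")).utcoffset()
    yxe = when.replace(tzinfo=ZoneInfo("America/Regina")).utcoffset()
    return (sto - yxe).total_seconds() / 3600

print(hours_ahead(datetime(2024, 1, 15, 12)))  # 7.0 in winter (CET vs CST)
print(hours_ahead(datetime(2024, 7, 15, 12)))  # 8.0 in summer (CEST vs CST)
```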
Flight carbon footprint between Stockholm Arlanda Airport (ARN) and Saskatoon John G. Diefenbaker International Airport (YXE)
On average, flying from Stockholm to Saskatoon generates about 473 kg of CO2 per passenger, which is about 1,043 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
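The unit conversion, and a per-kilometre intensity figure, follow directly from the numbers above. A minimal sketch (the 2.20462 lb/kg factor is the standard conversion):

```python
KG_TO_LB = 2.20462    # pounds per kilogram

co2_kg = 473          # per-passenger estimate quoted above
distance_km = 6653    # Vincenty distance quoted above

co2_lb = co2_kg * KG_TO_LB
grams_per_km = co2_kg * 1000 / distance_km

print(f"{co2_lb:.0f} lb total")        # ~1043 lb, matching the figure above
print(f"{grams_per_km:.0f} g CO2/km")  # ~71 g per passenger-kilometre
```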
Map of flight path from Stockholm to Saskatoon
See the map of the shortest flight path between Stockholm Arlanda Airport (ARN) and Saskatoon John G. Diefenbaker International Airport (YXE).
Airport information
| Origin | Stockholm Arlanda Airport |
| --- | --- |
| City | Stockholm |
| Country | Sweden |
| IATA Code | ARN |
| ICAO Code | ESSA |
| Coordinates | 59°39′6″N, 17°55′6″E |
| Destination | Saskatoon John G. Diefenbaker International Airport |
| --- | --- |
| City | Saskatoon |
| Country | Canada |
| IATA Code | YXE |
| ICAO Code | CYXE |
| Coordinates | 52°10′14″N, 106°41′59″W |
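The coordinates above are given in degrees, minutes, and seconds; distance formulas such as the haversine expect decimal degrees. A small helper (hypothetical, for illustration) performs the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# ARN latitude: 59°39′6″N
print(dms_to_decimal(59, 39, 6, "N"))    # ≈ 59.6517
# YXE longitude: 106°41′59″W
print(dms_to_decimal(106, 41, 59, "W"))  # ≈ -106.6997
```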