How far is Semey from Saskatoon?
The distance between Saskatoon (Saskatoon John G. Diefenbaker International Airport) and Semey (Semey Airport) is 5358 miles / 8623 kilometers / 4656 nautical miles.
Saskatoon John G. Diefenbaker International Airport – Semey Airport
Distance from Saskatoon to Semey
There are several ways to calculate the distance from Saskatoon to Semey. Here are two standard methods:
Vincenty's formula (applied above)
- 5358.035 miles
- 8622.922 kilometers
- 4656.006 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
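For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse solution on the WGS-84 ellipsoid. The airport coordinates come from the tables at the bottom of this page; the iteration cap and convergence tolerance are arbitrary choices, not part of any published implementation.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Inverse Vincenty solution on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0                 # semi-major axis (metres)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2) -
        B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344      # metres -> statute miles

# YXE (52°10′14″N, 106°41′59″W) to PLX (50°21′4″N, 80°14′3″E)
print(vincenty_miles(52.1706, -106.6997, 50.3511, 80.2342))  # ≈ 5358 miles
```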
Haversine formula
- 5341.601 miles
- 8596.473 kilometers
- 4641.724 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
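The haversine calculation is simple enough to show directly. The sketch below uses a mean Earth radius of 3,958.8 miles, so the result may differ from the figure above by a mile or so depending on the radius constant chosen.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance on a sphere via the haversine formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_mi * math.asin(math.sqrt(a))

# YXE (52°10′14″N, 106°41′59″W) to PLX (50°21′4″N, 80°14′3″E)
print(haversine_miles(52.1706, -106.6997, 50.3511, 80.2342))  # ≈ 5342 miles
```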
How long does it take to fly from Saskatoon to Semey?
The estimated flight time from Saskatoon John G. Diefenbaker International Airport to Semey Airport is 10 hours and 38 minutes.
What is the time difference between Saskatoon and Semey?
The time difference between Saskatoon and Semey is 11 hours. Semey is 11 hours ahead of Saskatoon.
Flight carbon footprint between Saskatoon John G. Diefenbaker International Airport (YXE) and Semey Airport (PLX)
On average, flying from Saskatoon to Semey generates about 631 kg of CO2 per passenger, which is roughly 1,391 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
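As a rough sketch of how such an estimate can be reproduced, the snippet below back-calculates a per-passenger-mile emission factor from the figures on this page and converts kilograms to pounds. The factor is an assumption derived from these numbers, not a published methodology.

```python
# Emission factor back-calculated from the figures above (an assumption):
# 631 kg CO2 / 5358 miles ≈ 0.118 kg per passenger-mile.
KG_PER_PASSENGER_MILE = 631 / 5358
KG_TO_LB = 2.20462

distance_mi = 5358
co2_kg = distance_mi * KG_PER_PASSENGER_MILE   # ≈ 631 kg
co2_lb = co2_kg * KG_TO_LB                     # ≈ 1,391 lb
print(f"{co2_kg:.0f} kg ≈ {co2_lb:.0f} lb CO2 per passenger")
```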
Map of flight path from Saskatoon to Semey
See the map of the shortest flight path between Saskatoon John G. Diefenbaker International Airport (YXE) and Semey Airport (PLX).
Airport information
| Origin | Saskatoon John G. Diefenbaker International Airport |
| --- | --- |
| City: | Saskatoon |
| Country: | Canada |
| IATA Code: | YXE |
| ICAO Code: | CYXE |
| Coordinates: | 52°10′14″N, 106°41′59″W |
| Destination | Semey Airport |
| --- | --- |
| City: | Semey |
| Country: | Kazakhstan |
| IATA Code: | PLX |
| ICAO Code: | UASS |
| Coordinates: | 50°21′4″N, 80°14′3″E |