How far is Saskatoon from Altai?
The distance between Altai (Altai Airport) and Saskatoon (Saskatoon John G. Diefenbaker International Airport) is 5510 miles / 8867 kilometers / 4788 nautical miles.
Altai Airport – Saskatoon John G. Diefenbaker International Airport
Distance from Altai to Saskatoon
There are several ways to calculate the distance from Altai to Saskatoon. Here are two standard methods:
Vincenty's formula (applied above)
- 5509.879 miles
- 8867.290 kilometers
- 4787.954 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
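For reference, a similar ellipsoidal (WGS-84) distance can be computed in Python with the geopy library. This is only a sketch, not the calculator used above: geopy's geodesic uses Karney's algorithm rather than Vincenty's iteration, but the two agree to well under a metre on a route like this, and geopy is an assumed dependency.

```python
# Sketch (not the article's calculator): ellipsoidal distance on WGS-84 using
# geopy's geodesic (Karney's algorithm), which agrees closely with Vincenty.
from geopy.distance import geodesic

altai = (46.376389, 96.220833)        # LTI, decimal degrees from 46°22′35″N, 96°13′15″E
saskatoon = (52.170556, -106.699722)  # YXE, decimal degrees from 52°10′14″N, 106°41′59″W

d = geodesic(altai, saskatoon)
print(f"{d.miles:.3f} mi  {d.km:.3f} km  {d.nm:.3f} NM")
# Should come out very near the figures above: ~5509.9 mi / 8867.3 km / 4788.0 NM
```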
Haversine formula
- 5493.739 miles
- 8841.316 kilometers
- 4773.929 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
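Below is a minimal Python sketch of the haversine calculation. The mean Earth radius of 6371 km is an assumption; the exact radius behind the figures above is not stated.

```python
# Sketch: haversine (great-circle) distance on a sphere of mean radius 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometres."""
    phi1, lam1, phi2, lam2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((phi2 - phi1) / 2) ** 2 + cos(phi1) * cos(phi2) * sin((lam2 - lam1) / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(46.376389, 96.220833, 52.170556, -106.699722)
print(f"{km:.3f} km  {km / 1.609344:.3f} mi  {km / 1.852:.3f} NM")
# ~8841 km / ~5494 mi / ~4774 NM, matching the haversine figures above
```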
How long does it take to fly from Altai to Saskatoon?
The estimated flight time from Altai Airport to Saskatoon John G. Diefenbaker International Airport is 10 hours and 55 minutes.
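The formula behind this estimate is not stated. A common approach is distance divided by an assumed average speed plus a fixed allowance for taxi, climb and descent; the sketch below, with an assumed ~850 km/h cruise and a 30-minute allowance, lands within a minute or two of the quoted figure.

```python
# Sketch: a simple flight-time estimate. The cruise speed and buffer are
# assumptions, not the article's documented method.
def estimated_flight_time(distance_km, cruise_kmh=850.0, buffer_min=30.0):
    minutes = distance_km / cruise_kmh * 60 + buffer_min
    hours, mins = divmod(round(minutes), 60)
    return f"{hours} h {mins} min"

print(estimated_flight_time(8867.29))  # -> roughly "10 h 56 min"
```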
What is the time difference between Altai and Saskatoon?
The time difference between Altai and Saskatoon is 13 hours. Saskatoon is 13 hours behind Altai.
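A quick way to check an offset difference like this is Python's standard zoneinfo module. The zone names below are assumptions chosen to reproduce the 13-hour gap stated above: Asia/Hovd (UTC+7) for Altai and America/Regina (UTC-6 year-round) for Saskatoon.

```python
# Sketch: standard UTC-offset difference via the stdlib zoneinfo module.
# The zone choices are assumptions, picked to match the 13-hour gap above.
from datetime import datetime
from zoneinfo import ZoneInfo

def offset_difference_hours(zone_a, zone_b, when=None):
    when = when or datetime(2024, 1, 15, 12, 0)   # any reference instant
    off_a = when.replace(tzinfo=ZoneInfo(zone_a)).utcoffset()
    off_b = when.replace(tzinfo=ZoneInfo(zone_b)).utcoffset()
    return (off_a - off_b).total_seconds() / 3600

print(offset_difference_hours("Asia/Hovd", "America/Regina"))  # -> 13.0
```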
Flight carbon footprint between Altai Airport (LTI) and Saskatoon John G. Diefenbaker International Airport (YXE)
On average, flying from Altai to Saskatoon generates about 651 kg of CO2 per passenger (651 kilograms equals 1,435 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
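The emission factor behind this estimate is not stated; dividing the quoted figure by the route distance implies roughly 0.073 kg of CO2 per passenger-kilometre. A small sketch of that arithmetic and the kilogram-to-pound conversion:

```python
# Sketch: implied per-passenger-km emission factor and the unit conversion.
# The factor is back-calculated from the article's figures, not documented by it.
distance_km = 8867.29
co2_kg = 651.0
factor = co2_kg / distance_km
print(f"implied factor: {factor:.3f} kg CO2 per passenger-km")  # ~0.073
print(f"{co2_kg} kg = {co2_kg * 2.20462262:.0f} lbs")           # -> 1435 lbs
```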
Map of flight path from Altai to Saskatoon
See the map of the shortest flight path between Altai Airport (LTI) and Saskatoon John G. Diefenbaker International Airport (YXE).
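If you want to draw this path yourself, the sketch below samples intermediate waypoints along the great circle using the standard spherical interpolation formula. It is not tied to the map on the original page or to any particular plotting library.

```python
# Sketch: waypoints along the great-circle (shortest) path between LTI and YXE,
# via spherical linear interpolation between the two endpoints.
from math import radians, degrees, sin, cos, atan2, sqrt, asin

def great_circle_waypoints(lat1, lon1, lat2, lon2, n=20):
    phi1, lam1, phi2, lam2 = map(radians, (lat1, lon1, lat2, lon2))
    # central angle between the endpoints (haversine form)
    a = sin((phi2 - phi1) / 2) ** 2 + cos(phi1) * cos(phi2) * sin((lam2 - lam1) / 2) ** 2
    delta = 2 * asin(sqrt(a))
    points = []
    for i in range(n + 1):
        f = i / n
        A = sin((1 - f) * delta) / sin(delta)
        B = sin(f * delta) / sin(delta)
        x = A * cos(phi1) * cos(lam1) + B * cos(phi2) * cos(lam2)
        y = A * cos(phi1) * sin(lam1) + B * cos(phi2) * sin(lam2)
        z = A * sin(phi1) + B * sin(phi2)
        points.append((degrees(atan2(z, sqrt(x * x + y * y))), degrees(atan2(y, x))))
    return points

for lat, lon in great_circle_waypoints(46.376389, 96.220833, 52.170556, -106.699722, n=5):
    print(f"{lat:8.3f}, {lon:9.3f}")
```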
Airport information
| Origin | Altai Airport |
|---|---|
| City: | Altai |
| Country: | Mongolia |
| IATA Code: | LTI |
| ICAO Code: | ZMAT |
| Coordinates: | 46°22′35″N, 96°13′15″E |
| Destination | Saskatoon John G. Diefenbaker International Airport |
|---|---|
| City: | Saskatoon |
| Country: | Canada |
| IATA Code: | YXE |
| ICAO Code: | CYXE |
| Coordinates: | 52°10′14″N, 106°41′59″W |
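The coordinates above are given in degrees, minutes and seconds; here is a small sketch for converting them to the decimal degrees used by the distance calculations earlier on this page.

```python
# Sketch: degrees/minutes/seconds -> signed decimal degrees.
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

lti = (dms_to_decimal(46, 22, 35, "N"), dms_to_decimal(96, 13, 15, "E"))
yxe = (dms_to_decimal(52, 10, 14, "N"), dms_to_decimal(106, 41, 59, "W"))
print(lti)  # (46.3764, 96.2208)
print(yxe)  # (52.1706, -106.6997)
```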