
How far is Saskatoon from Bangkok?

The distance between Bangkok (Suvarnabhumi Airport) and Saskatoon (Saskatoon John G. Diefenbaker International Airport) is 7610 miles / 12247 kilometers / 6613 nautical miles.

Suvarnabhumi Airport – Saskatoon John G. Diefenbaker International Airport

Distance: 7610 miles / 12247 kilometers / 6613 nautical miles


Distance from Bangkok to Saskatoon

There are several ways to calculate the distance from Bangkok to Saskatoon. Here are two standard methods:

Vincenty's formula (applied above)
  • 7609.971 miles
  • 12247.062 kilometers
  • 6612.884 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
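For readers who want to reproduce the figure above, here is a minimal, self-contained Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, iteration limit and convergence tolerance are illustrative choices rather than the site's own implementation; the coordinates are taken from the airport information further down the page.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty: geodesic distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0               # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2)
        - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma)                      # metres

# Coordinates from the airport information below (converted to decimal degrees)
bkk = (13.6808, 100.7469)    # Suvarnabhumi Airport
yxe = (52.1706, -106.6997)   # Saskatoon John G. Diefenbaker International Airport
print(vincenty_distance(*bkk, *yxe) / 1000)   # ≈ 12247 km
```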

Haversine formula
  • 7600.275 miles
  • 12231.456 kilometers
  • 6604.458 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
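For comparison, a compact Python sketch of the haversine formula, using the common mean Earth radius of 6371 km (an assumed value; a different radius shifts the result slightly):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean radius, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_distance(13.6808, 100.7469, 52.1706, -106.6997))   # ≈ 12231 km
```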

How long does it take to fly from Bangkok to Saskatoon?

The estimated flight time from Suvarnabhumi Airport to Saskatoon John G. Diefenbaker International Airport is 14 hours and 54 minutes.
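The page does not state how this estimate is derived. One simple model that happens to reproduce the quoted figure is cruise time at an assumed average speed of about 850 km/h plus a fixed 30-minute allowance for takeoff and landing; the Python sketch below uses those assumed values and is an illustration only.

```python
def estimate_flight_time(distance_km, cruise_kmh=850.0, overhead_hours=0.5):
    """Rough block-time estimate: cruise time plus a fixed takeoff/landing allowance."""
    hours = distance_km / cruise_kmh + overhead_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours and {m} minutes"

print(estimate_flight_time(12247))   # "14 hours and 54 minutes" under these assumptions
```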

Flight carbon footprint between Suvarnabhumi Airport (BKK) and Saskatoon John G. Diefenbaker International Airport (YXE)

On average, flying from Bangkok to Saskatoon generates about 942 kg of CO2 per passenger, which is equivalent to roughly 2,076 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
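The exact emissions model behind this number is not stated. As an illustration only, a per-passenger estimate can be built from an assumed long-haul fuel burn of roughly 0.024 kg of fuel per passenger-kilometre and the standard factor of about 3.16 kg of CO2 released per kg of jet fuel burned; with those assumed inputs the result lands close to the quoted 942 kg.

```python
JET_FUEL_CO2_PER_KG = 3.16   # kg of CO2 released per kg of jet fuel burned

def estimate_co2_per_passenger(distance_km, fuel_kg_per_pax_km=0.0243):
    """Very rough per-passenger CO2 estimate for a long-haul flight (illustrative only)."""
    fuel_kg = distance_km * fuel_kg_per_pax_km
    return fuel_kg * JET_FUEL_CO2_PER_KG

kg = estimate_co2_per_passenger(12247)
print(f"{kg:.0f} kg CO2 ≈ {kg * 2.20462:.0f} lbs")   # ≈ 940 kg / 2073 lbs with these assumptions
```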

Map of flight path from Bangkok to Saskatoon

See the map of the shortest flight path between Suvarnabhumi Airport (BKK) and Saskatoon John G. Diefenbaker International Airport (YXE).

Airport information

Origin: Suvarnabhumi Airport
City: Bangkok
Country: Thailand
IATA Code: BKK
ICAO Code: VTBS
Coordinates: 13°40′51″N, 100°44′49″E
Destination: Saskatoon John G. Diefenbaker International Airport
City: Saskatoon
Country: Canada
IATA Code: YXE
ICAO Code: CYXE
Coordinates: 52°10′14″N, 106°41′59″W
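The coordinates above are given in degrees, minutes and seconds, while the code examples earlier use decimal degrees. A small helper like the sketch below (an illustrative function, not part of the site) performs the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(13, 40, 51, "N"), dms_to_decimal(100, 44, 49, "E"))    # BKK ≈ 13.6808, 100.7469
print(dms_to_decimal(52, 10, 14, "N"), dms_to_decimal(106, 41, 59, "W"))    # YXE ≈ 52.1706, -106.6997
```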