How far is Saskatoon from Barcelona?
The distance between Barcelona (Barcelona–El Prat Airport) and Saskatoon (Saskatoon John G. Diefenbaker International Airport) is 4719 miles / 7594 kilometers / 4100 nautical miles.
Distance from Barcelona to Saskatoon
There are several ways to calculate the distance from Barcelona to Saskatoon. Here are two standard methods:
Vincenty's formula (applied above)
- 4718.501 miles
- 7593.691 kilometers
- 4100.265 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
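For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal-degree coordinates are converted from the airport tables below; the convergence tolerance and iteration cap are conventional choices, not values stated on this page.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: geodesic distance in metres on WGS-84."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero for a line along the equator
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# BCN -> YXE; result should land close to the ~4718.5 miles quoted above
metres = vincenty_distance(41.29694, 2.07833, 52.17056, -106.69972)
print(metres / 1609.344, "miles")
```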
Haversine formula
- 4705.472 miles
- 7572.724 kilometers
- 4088.944 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
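A minimal haversine sketch, assuming the commonly used mean Earth radius of 6,371 km (the page does not state which radius it uses, so the last decimals may differ):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# BCN -> YXE, decimal degrees from the airport tables below
print(haversine_distance(41.29694, 2.07833, 52.17056, -106.69972))  # ~7573 km
```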
How long does it take to fly from Barcelona to Saskatoon?
The estimated flight time from Barcelona–El Prat Airport to Saskatoon John G. Diefenbaker International Airport is 9 hours and 26 minutes.
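The page does not state its assumptions, but the quoted figure is reproduced exactly by dividing the Vincenty distance by a 500 mph average speed, a common convention for such estimates:

```python
# Assumed: ~500 mph average speed (not stated on this page)
distance_miles = 4718.501
cruise_mph = 500
hours = distance_miles / cruise_mph
print(f"{int(hours)} h {round((hours % 1) * 60)} min")  # 9 h 26 min
```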
What is the time difference between Barcelona and Saskatoon?
Barcelona is on Central European Time (UTC+1), while Saskatoon observes Central Standard Time (UTC−6) year-round, since Saskatchewan does not use daylight saving time. Saskatoon is therefore 7 hours behind Barcelona, or 8 hours behind during European daylight saving time.
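A quick way to check the offset programmatically, assuming Python 3.9+ with the standard zoneinfo module (Saskatoon falls under the America/Regina tz database zone):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

now = datetime.now(timezone.utc)
bcn = now.astimezone(ZoneInfo("Europe/Madrid")).utcoffset()
yxe = now.astimezone(ZoneInfo("America/Regina")).utcoffset()
print((bcn - yxe).total_seconds() / 3600)  # 7.0 in winter, 8.0 in summer
```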
Flight carbon footprint between Barcelona–El Prat Airport (BCN) and Saskatoon John G. Diefenbaker International Airport (YXE)
On average, flying from Barcelona to Saskatoon generates about 547 kg (1,207 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
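As a quick sanity check on the unit conversion (a straight conversion of the rounded 547 kg gives 1,206 lbs, so the page's 1,207 presumably comes from an unrounded kilogram value):

```python
co2_kg = 547                  # page's per-passenger estimate (jet-fuel CO2 only)
co2_lbs = co2_kg * 2.20462    # kilograms -> pounds
print(round(co2_lbs))         # 1206
```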
Map of flight path from Barcelona to Saskatoon
See the map of the shortest flight path between Barcelona–El Prat Airport (BCN) and Saskatoon John G. Diefenbaker International Airport (YXE).
Airport information
| Origin | Barcelona–El Prat Airport |
| --- | --- |
| City | Barcelona |
| Country | Spain |
| IATA Code | BCN |
| ICAO Code | LEBL |
| Coordinates | 41°17′49″N, 2°4′42″E |
| Destination | Saskatoon John G. Diefenbaker International Airport |
| --- | --- |
| City | Saskatoon |
| Country | Canada |
| IATA Code | YXE |
| ICAO Code | CYXE |
| Coordinates | 52°10′14″N, 106°41′59″W |
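To use the tabulated coordinates in the formulas above, they must first be converted from degrees/minutes/seconds to decimal degrees; a minimal sketch:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Degrees/minutes/seconds plus hemisphere letter -> signed decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Coordinates from the airport tables above
bcn = (dms_to_decimal(41, 17, 49, "N"), dms_to_decimal(2, 4, 42, "E"))
yxe = (dms_to_decimal(52, 10, 14, "N"), dms_to_decimal(106, 41, 59, "W"))
print(bcn)  # (41.29694..., 2.07833...)
print(yxe)  # (52.17055..., -106.69972...)
```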