
How far is Abbotsford from Shenyang?

The distance between Shenyang (Shenyang Taoxian International Airport) and Abbotsford (Abbotsford International Airport) is 5021 miles / 8081 kilometers / 4363 nautical miles.

Shenyang Taoxian International Airport – Abbotsford International Airport

5021 miles · 8081 kilometers · 4363 nautical miles


Distance from Shenyang to Abbotsford

There are several ways to calculate the distance from Shenyang to Abbotsford. Here are two standard methods:

Vincenty's formula (applied above)
  • 5021.222 miles
  • 8080.873 kilometers
  • 4363.322 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
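A quick way to reproduce the ellipsoidal figure is with a geodesic library. The sketch below assumes geopy is installed and uses its WGS-84 geodesic solver (Karney's method, which for routes like this agrees with Vincenty's result to well under a metre); the decimal coordinates are converted from the airport information listed further down.

```python
# Sketch: ellipsoidal (WGS-84) airport-to-airport distance using geopy.
# Assumes `pip install geopy`; coordinates come from the airport info below.
from geopy.distance import geodesic

she = (41.63972, 123.48278)   # Shenyang Taoxian (SHE), decimal degrees
yxx = (49.02528, -122.36083)  # Abbotsford (YXX), decimal degrees

d = geodesic(she, yxx)        # WGS-84 ellipsoid by default
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} NM")
# Expect values very close to the Vincenty figures above (~5021 mi / ~8081 km).
```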

Haversine formula
  • 5007.565 miles
  • 8058.895 kilometers
  • 4351.455 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
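The spherical figure is simple enough to compute by hand. The sketch below implements the haversine formula directly with the same coordinates, assuming a mean Earth radius of 6371 km (the exact result shifts slightly with the radius chosen).

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two latitude/longitude points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# SHE (41°38′23″N, 123°28′58″E) to YXX (49°1′31″N, 122°21′39″W)
km = haversine_km(41.63972, 123.48278, 49.02528, -122.36083)
print(f"{km:.0f} km / {km * 0.621371:.0f} mi / {km * 0.539957:.0f} NM")
# ≈ 8059 km / 5008 mi / 4352 NM, within rounding of the haversine figures above.
```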

How long does it take to fly from Shenyang to Abbotsford?

The estimated flight time from Shenyang Taoxian International Airport to Abbotsford International Airport is 10 hours and 0 minutes.
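The calculator does not publish its exact flight-time formula, but the estimate is consistent with a simple rule of thumb: divide the distance by an assumed average speed of roughly 500 mph. The snippet below shows that arithmetic (the 500 mph figure is an assumption, not the site's documented method) and lands within a couple of minutes of the 10 hours quoted above.

```python
# Rough flight-time estimate: distance divided by an assumed 500 mph average speed.
distance_mi = 5021
avg_speed_mph = 500          # hypothetical average block speed
hours = distance_mi / avg_speed_mph
h, m = int(hours), round((hours % 1) * 60)
print(f"~{h} h {m} min")     # ≈ 10 h 2 min, close to the estimate above
```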

Flight carbon footprint between Shenyang Taoxian International Airport (SHE) and Abbotsford International Airport (YXX)

On average, flying from Shenyang to Abbotsford generates about 587 kg of CO2 per passenger; 587 kilograms equals 1,293 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
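The pound figure is a straight unit conversion from kilograms (1 kg ≈ 2.20462 lb). Converting the rounded 587 kg gives about 1,294 lb; the published 1,293 lb presumably comes from the unrounded kilogram value.

```python
KG_TO_LB = 2.20462           # pounds per kilogram
co2_kg = 587
print(f"{co2_kg * KG_TO_LB:,.0f} lb")   # ≈ 1,294 lb from the rounded kg figure
```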

Map of flight path from Shenyang to Abbotsford

See the map of the shortest flight path between Shenyang Taoxian International Airport (SHE) and Abbotsford International Airport (YXX).

Airport information

Origin: Shenyang Taoxian International Airport
City: Shenyang
Country: China
IATA Code: SHE
ICAO Code: ZYTX
Coordinates: 41°38′23″N, 123°28′58″E
Destination: Abbotsford International Airport
City: Abbotsford
Country: Canada
IATA Code: YXX
ICAO Code: CYXX
Coordinates: 49°1′31″N, 122°21′39″W
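The coordinates above are given in degrees, minutes, and seconds; the decimal values used in the distance sketches earlier follow from a straightforward conversion, shown below.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(41, 38, 23, "N"), dms_to_decimal(123, 28, 58, "E"))  # SHE ≈ 41.6397, 123.4828
print(dms_to_decimal(49, 1, 31, "N"), dms_to_decimal(122, 21, 39, "W"))   # YXX ≈ 49.0253, -122.3608
```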