
How far is Toronto from Shenzhen?

The distance between Shenzhen (Shenzhen Bao'an International Airport) and Toronto (Billy Bishop Toronto City Airport) is 7795 miles / 12544 kilometers / 6773 nautical miles.

Distance from Shenzhen to Toronto

There are several ways to calculate the distance from Shenzhen to Toronto. Here are two standard methods:

Vincenty's formula (applied above)
  • 7794.773 miles
  • 12544.472 kilometers
  • 6773.473 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
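
For reference, here is a minimal Python sketch of the textbook Vincenty inverse method on the WGS-84 ellipsoid, using the decimal form of the airport coordinates listed under "Airport information" below. It is a standard implementation, not necessarily the exact code behind the figure above, and it skips the antipodal edge cases where Vincenty's iteration can fail to converge.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        # WGS-84 ellipsoid constants
        a = 6378137.0               # semi-major axis in meters
        f = 1 / 298.257223563       # flattening
        b = (1 - f) * a
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)
        lam = L
        for _ in range(max_iter):   # iterate lambda until it stabilizes
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0          # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
                  (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break
        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1609.344  # meters -> statute miles

    # SZX -> YTZ in decimal degrees (see "Airport information" below)
    print(vincenty_miles(22.63917, 113.81083, 43.62722, -79.39611))  # ≈ 7794.8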

Haversine formula
  • 7782.118 miles
  • 12524.105 kilometers
  • 6762.476 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
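
A haversine sketch is much shorter. The mean Earth radius of 3958.8 statute miles (≈ 6371 km) is an assumption; the site's exact constant isn't stated, so the final digits may differ slightly.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius=3958.8):
        # radius = assumed mean Earth radius in statute miles
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius * math.asin(math.sqrt(a))

    print(haversine_miles(22.63917, 113.81083, 43.62722, -79.39611))  # ≈ 7782 miles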

How long does it take to fly from Shenzhen to Toronto?

The estimated flight time from Shenzhen Bao'an International Airport to Billy Bishop Toronto City Airport is 15 hours and 15 minutes.
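
The page does not say how this estimate is derived. Working backwards from its own numbers, 7795 miles in 15 hours 15 minutes implies an average block speed of roughly 511 mph, consistent with typical long-haul jet speeds once taxi, climb, and descent are averaged in:

    distance_mi = 7795
    block_hours = 15 + 15 / 60        # the 15 h 15 min quoted above
    print(distance_mi / block_hours)  # ≈ 511 mph implied average speed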

Flight carbon footprint between Shenzhen Bao'an International Airport (SZX) and Billy Bishop Toronto City Airport (YTZ)

On average, flying from Shenzhen to Toronto generates about 968 kg of CO2 per passenger; 968 kilograms equals 2,135 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
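
The conversion and a per-mile figure follow directly from the numbers above (a straight conversion at 2.20462 lb/kg gives about 2134 lb, so the page's 2,135 reflects rounding before conversion):

    co2_kg = 968                   # per-passenger estimate quoted above
    print(co2_kg * 2.20462)        # ≈ 2134 lb (the page shows 2,135)
    print(co2_kg * 1000 / 7795)    # ≈ 124 g of CO2 per passenger-mile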

Map of flight path from Shenzhen to Toronto

See the map of the shortest flight path between Shenzhen Bao'an International Airport (SZX) and Billy Bishop Toronto City Airport (YTZ).

Airport information

Origin: Shenzhen Bao'an International Airport
City: Shenzhen
Country: China
IATA Code: SZX
ICAO Code: ZGSZ
Coordinates: 22°38′21″N, 113°48′39″E
Destination: Billy Bishop Toronto City Airport
City: Toronto
Country: Canada
IATA Code: YTZ
ICAO Code: CYTZ
Coordinates: 43°37′38″N, 79°23′46″W
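
The decimal coordinates used in the distance sketches above are just these degree/minute/second values converted; a minimal converter:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        # Southern and western hemispheres take a negative sign
        value = deg + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    print(dms_to_decimal(22, 38, 21, "N"), dms_to_decimal(113, 48, 39, "E"))  # SZX ≈ 22.63917, 113.81083
    print(dms_to_decimal(43, 37, 38, "N"), dms_to_decimal(79, 23, 46, "W"))   # YTZ ≈ 43.62722, -79.39611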