How far is Toronto from Nanjing?
The distance between Nanjing (Nanjing Lukou International Airport) and Toronto (Billy Bishop Toronto City Airport) is 7118 miles / 11455 kilometers / 6185 nautical miles.
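The three figures above are the same distance expressed in different units. As a quick sketch of how they relate (using the exact definitions of the statute mile and the nautical mile):

```python
# Convert the quoted distance between units. The rounded mile figure
# from above is used as the starting point.
miles = 7118
km = miles * 1.609344        # 1 statute mile = 1.609344 km exactly
nmi = km / 1.852             # 1 nautical mile = 1.852 km exactly
print(round(km), round(nmi)) # 11455 6185
```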
Nanjing Lukou International Airport – Billy Bishop Toronto City Airport
Distance from Nanjing to Toronto
There are several ways to calculate the distance from Nanjing to Toronto. Here are two standard methods:
Vincenty's formula (applied above)

- 7117.722 miles
- 11454.863 kilometers
- 6185.131 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
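As an illustration, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal-degree coordinates are converted from the DMS values in the airport table below; this is a sketch of the standard algorithm, not necessarily the exact code behind the figure above.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in meters on the WGS-84 ellipsoid (Vincenty's inverse formula)."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break  # converged

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# NKG and YTZ in decimal degrees (converted from the table below)
print(vincenty_inverse(31.742, 118.862, 43.627, -79.396) / 1000)  # ≈ 11455 km
```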
Haversine formula

- 7102.984 miles
- 11431.145 kilometers
- 6172.325 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
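The haversine calculation is short enough to show in full. The sketch below assumes a mean Earth radius of 6,371 km, which reproduces the kilometer figure above to within rounding:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r (mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# NKG → YTZ, decimal degrees converted from the airport table below
print(haversine_km(31.742, 118.862, 43.627, -79.396))  # ≈ 11431 km
```

The ~24 km gap between this result and Vincenty's is the price of treating the Earth as a perfect sphere.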
How long does it take to fly from Nanjing to Toronto?
The estimated flight time from Nanjing Lukou International Airport to Billy Bishop Toronto City Airport is 13 hours and 58 minutes.
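Estimates like this are typically the distance divided by an assumed average block speed. The speed below is an assumption chosen for illustration, not the calculator's actual parameter:

```python
# Rough flight-time estimate: distance / assumed average speed.
# avg_speed_mph is a hypothetical long-haul average, not a published figure.
distance_mi = 7117.722          # Vincenty distance from above
avg_speed_mph = 510             # assumed average ground speed

hours = distance_mi / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")         # ≈ 13 h 57 min under these assumptions
```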
What is the time difference between Nanjing and Toronto?
The time difference between Nanjing and Toronto is 13 hours: Toronto is 13 hours behind Nanjing. This holds while Toronto is on Eastern Standard Time; during daylight saving time the difference shrinks to 12 hours, since China does not observe DST.
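The offset can be checked with the standard library's `zoneinfo` (Python 3.9+), using a winter date when Toronto is on Eastern Standard Time:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Same wall-clock instant stamped with each city's zone.
# Nanjing uses China Standard Time (Asia/Shanghai), UTC+8 year-round.
t = datetime(2024, 1, 15, 12, 0)
nanjing = t.replace(tzinfo=ZoneInfo("Asia/Shanghai"))
toronto = t.replace(tzinfo=ZoneInfo("America/Toronto"))

diff = nanjing.utcoffset() - toronto.utcoffset()
print(diff)  # 13:00:00 — becomes 12 hours while Toronto observes DST
```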
Flight carbon footprint between Nanjing Lukou International Airport (NKG) and Billy Bishop Toronto City Airport (YTZ)
On average, flying from Nanjing to Toronto generates about 871 kg of CO2 per passenger, which equals roughly 1,921 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path from Nanjing to Toronto
See the map of the shortest flight path between Nanjing Lukou International Airport (NKG) and Billy Bishop Toronto City Airport (YTZ).
Airport information
| Origin | Nanjing Lukou International Airport |
| --- | --- |
| City | Nanjing |
| Country | China |
| IATA Code | NKG |
| ICAO Code | ZSNJ |
| Coordinates | 31°44′31″N, 118°51′43″E |
| Destination | Billy Bishop Toronto City Airport |
| --- | --- |
| City | Toronto |
| Country | Canada |
| IATA Code | YTZ |
| ICAO Code | CYTZ |
| Coordinates | 43°37′38″N, 79°23′46″W |