
How far is Trincomalee from Toronto?

The distance between Toronto (Toronto Pearson International Airport) and Trincomalee (China Bay Airport) is 8642 miles / 13908 kilometers / 7510 nautical miles.

Toronto Pearson International Airport – China Bay Airport

Distance: 8642 miles / 13908 kilometers / 7510 nautical miles
Flight time: 16 h 51 min
Time difference: 10 h 30 min
CO2 emission: 1 093 kg

Distance from Toronto to Trincomalee

There are several ways to calculate the distance from Toronto to Trincomalee. Here are two standard methods:

Vincenty's formula (applied above)
  • 8641.990 miles
  • 13907.935 kilometers
  • 7509.684 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
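
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are the decimal-degree equivalents of the airport coordinates listed in the airport information section below; the iteration limit and convergence tolerance are arbitrary choices, not part of the formula itself.

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Geodesic distance in metres on the WGS-84 ellipsoid."""
        a = 6378137.0              # semi-major axis (m)
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos2_alpha == 0 only for equatorial lines; the term is then unused
            cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                            if cos2_alpha else 0.0)
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma *
                    (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - d_sigma)

    # YYZ and TRR in decimal degrees (see airport information below)
    d = vincenty_inverse(43.6769, -79.6306, 8.5383, 81.1817)
    print(f"{d / 1609.344:.3f} mi / {d / 1000:.3f} km")  # ≈ 8641.99 mi / 13907.9 km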

Haversine formula
  • 8633.690 miles
  • 13894.578 kilometers
  • 7502.472 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth. This gives the great-circle distance: the shortest path between two points along the earth's surface.
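
The haversine formula is short enough to show in full. Here is a minimal Python sketch using the conventional mean Earth radius of 6371 km; the exact result shifts slightly depending on the radius chosen.

    import math

    def haversine(lat1, lon1, lat2, lon2, r_km=6371.0):
        """Great-circle distance on a sphere of radius r_km."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * r_km * math.asin(math.sqrt(h))

    d_km = haversine(43.6769, -79.6306, 8.5383, 81.1817)
    # → roughly the 13894.6 km / 8633.7 mi figure above
    print(f"{d_km:.1f} km / {d_km / 1.609344:.1f} mi")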

How long does it take to fly from Toronto to Trincomalee?

The estimated flight time from Toronto Pearson International Airport to China Bay Airport is 16 hours and 51 minutes.
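
The page does not state how this estimate is derived. As a rough illustration, assuming an average cruise speed of about 850 km/h plus a fixed 30-minute allowance for take-off and landing reproduces the figure to within a minute; both values are assumptions, not the calculator's published formula.

    # Assumed values: neither the cruise speed nor the 30-minute
    # take-off/landing allowance comes from the page itself.
    distance_km = 13907.935
    cruise_kmh = 850.0
    overhead_min = 30

    total_min = distance_km / cruise_kmh * 60 + overhead_min
    print(f"{int(total_min // 60)} h {total_min % 60:.0f} min")  # → 16 h 52 min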

Flight carbon footprint between Toronto Pearson International Airport (YYZ) and China Bay Airport (TRR)

On average, flying from Toronto to Trincomalee generates about 1 093 kg of CO2 per passenger, which is equivalent to 2 410 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
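
Both the unit conversion and the implied per-kilometre emission rate are simple arithmetic on the figures above:

    distance_km = 13907.935
    co2_kg = 1093                    # per-passenger estimate from the page
    KG_PER_LB = 0.453592

    print(f"{co2_kg / KG_PER_LB:.0f} lbs")                  # → 2410 lbs
    print(f"{co2_kg / distance_km * 1000:.1f} g CO2/km")    # → 78.6 g per passenger-km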

Map of flight path from Toronto to Trincomalee

See the map of the shortest flight path between Toronto Pearson International Airport (YYZ) and China Bay Airport (TRR).

Airport information

Origin: Toronto Pearson International Airport
City: Toronto
Country: Canada
IATA Code: YYZ
ICAO Code: CYYZ
Coordinates: 43°40′37″N, 79°37′50″W
Destination: China Bay Airport
City: Trincomalee
Country: Sri Lanka
IATA Code: TRR
ICAO Code: VCCT
Coordinates: 8°32′18″N, 81°10′54″E
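
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small helper for the conversion (the function name is illustrative):

    def dms_to_decimal(deg, minutes, seconds, hemi):
        """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
        sign = -1 if hemi in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    # YYZ: 43°40′37″N, 79°37′50″W  →  (43.6769, -79.6306)
    print(dms_to_decimal(43, 40, 37, "N"), dms_to_decimal(79, 37, 50, "W"))
    # TRR: 8°32′18″N, 81°10′54″E   →  (8.5383, 81.1817)
    print(dms_to_decimal(8, 32, 18, "N"), dms_to_decimal(81, 10, 54, "E"))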