
How far is Senai from Toronto?

The distance between Toronto (Toronto Pearson International Airport) and Senai (Senai International Airport) is 9304 miles / 14974 kilometers / 8085 nautical miles.

Toronto Pearson International Airport – Senai International Airport

Distance: 9304 miles / 14974 kilometers / 8085 nautical miles
Flight time: 18 h 6 min
CO2 emissions: 1 194 kg


Distance from Toronto to Senai

There are several ways to calculate the distance from Toronto to Senai. Here are two standard methods:

Vincenty's formula (applied above)
  • 9304.281 miles
  • 14973.789 kilometers
  • 8085.199 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
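An ellipsoidal distance like the one above can be reproduced in a few lines of Python. The sketch below uses the geopy library's geodesic function, which solves the same ellipsoidal problem on the WGS-84 ellipsoid (via Karney's algorithm rather than Vincenty's iteration, so the result should land very close to the Vincenty figures above). The decimal coordinates are converted from the DMS values listed under Airport information.

```python
# Requires: pip install geopy
from geopy.distance import geodesic

# YYZ: 43°40′37″N, 79°37′50″W   JHB: 1°38′28″N, 103°40′11″E
yyz = (43.676944, -79.630556)
jhb = (1.641111, 103.669722)

d = geodesic(yyz, jhb)  # ellipsoidal distance on WGS-84
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} NM")
```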

Haversine formula
  • 9298.933 miles
  • 14965.183 kilometers
  • 8080.552 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
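For comparison, here is a minimal, self-contained haversine implementation using the same airport coordinates. With a mean Earth radius of 6371 km it comes out near the 14 965 km / 9 299 mi figures above; the exact value depends slightly on the radius chosen.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

yyz = (43.676944, -79.630556)   # Toronto Pearson
jhb = (1.641111, 103.669722)    # Senai

km = haversine_km(*yyz, *jhb)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} NM")
```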

How long does it take to fly from Toronto to Senai?

The estimated flight time from Toronto Pearson International Airport to Senai International Airport is 18 hours and 6 minutes.
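The page does not state how the flight time is derived, but a common rule of thumb adds a fixed allowance of about 30 minutes for taxi, climb, and descent to the time spent at a typical cruise speed of roughly 850 km/h. Those two numbers are assumptions on my part, not values given above, but they reproduce the stated duration closely:

```python
def estimate_flight_time(distance_km, cruise_kmh=850.0, overhead_hours=0.5):
    """Rough flight-time estimate: fixed taxi/climb/descent overhead plus cruise time."""
    hours = overhead_hours + distance_km / cruise_kmh
    h, m = divmod(round(hours * 60), 60)
    return f"{h} h {m} min"

print(estimate_flight_time(14974))  # about 18 h 7 min with these assumptions
```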

Flight carbon footprint between Toronto Pearson International Airport (YYZ) and Senai International Airport (JHB)

On average, flying from Toronto to Senai generates about 1 194 kg (2 632 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
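The pound figure is a straightforward unit conversion from the kilogram estimate:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 1194
print(f"{co2_kg} kg ≈ {co2_kg * KG_TO_LB:.0f} lb")  # ≈ 2632 lb
```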

Map of flight path from Toronto to Senai

See the map of the shortest flight path between Toronto Pearson International Airport (YYZ) and Senai International Airport (JHB).

Airport information

Origin: Toronto Pearson International Airport
City: Toronto
Country: Canada
IATA Code: YYZ
ICAO Code: CYYZ
Coordinates: 43°40′37″N, 79°37′50″W
Destination: Senai International Airport
City: Senai
Country: Malaysia
IATA Code: JHB
ICAO Code: WMKJ
Coordinates: 1°38′28″N, 103°40′11″E