How far is Toronto from Johannesburg?

The distance between Johannesburg (Lanseria International Airport) and Toronto (Billy Bishop Toronto City Airport) is 8267 miles / 13305 kilometers / 7184 nautical miles.

Lanseria International Airport – Billy Bishop Toronto City Airport

8267 miles / 13305 kilometers / 7184 nautical miles

Distance from Johannesburg to Toronto

There are several ways to calculate the distance from Johannesburg to Toronto. Here are two standard methods:

Vincenty's formula (applied above)
  • 8267.410 miles
  • 13305.107 kilometers
  • 7184.183 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
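To reproduce the ellipsoidal figure, the sketch below uses pyproj's Geod class, which applies Karney's geodesic algorithm on the WGS-84 ellipsoid rather than Vincenty's original iteration; the two agree to well under a metre on a route like this. The decimal-degree coordinates are converted from the DMS values listed under Airport information below.

    from pyproj import Geod

    # Decimal-degree equivalents of the DMS coordinates listed under Airport information.
    HLA = (-25.938333, 27.925833)   # Lanseria International Airport (lat, lon)
    YTZ = (43.627222, -79.396111)   # Billy Bishop Toronto City Airport (lat, lon)

    geod = Geod(ellps="WGS84")      # ellipsoidal Earth model
    # Geod.inv takes lon/lat order and returns both azimuths plus the distance in metres.
    _, _, meters = geod.inv(HLA[1], HLA[0], YTZ[1], YTZ[0])

    print(f"{meters / 1609.344:.3f} miles")          # ≈ 8267 mi
    print(f"{meters / 1000:.3f} kilometers")          # ≈ 13305 km
    print(f"{meters / 1852:.3f} nautical miles")      # ≈ 7184 NM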

Haversine formula
  • 8271.154 miles
  • 13311.133 kilometers
  • 7187.437 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points over the surface of the sphere).
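A minimal, self-contained haversine sketch is shown below; with a mean Earth radius of 6,371 km it reproduces the spherical figures above to within rounding.

    from math import radians, sin, cos, atan2, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two lat/lon points on a spherical Earth."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlmb = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
        return 2 * radius_km * atan2(sqrt(a), sqrt(1 - a))

    km = haversine_km(-25.938333, 27.925833, 43.627222, -79.396111)
    print(f"{km:.3f} km, {km / 1.609344:.3f} mi, {km / 1.852:.3f} NM")  # ≈ 13311 km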

How long does it take to fly from Johannesburg to Toronto?

The estimated flight time from Lanseria International Airport to Billy Bishop Toronto City Airport is 16 hours and 9 minutes.
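The speed assumption behind that estimate isn't stated on the page; dividing the 8,267-mile distance by 16 hours 9 minutes implies an average block speed of roughly 512 mph. A rough sketch of that kind of estimate, with the assumed average speed as an explicit parameter:

    def estimated_flight_time(distance_miles, avg_speed_mph=512):
        """Rough block-time estimate. avg_speed_mph is an assumed average speed
        derived from the quoted figures, not a value published by the calculator."""
        total_minutes = round(distance_miles / avg_speed_mph * 60)
        hours, minutes = divmod(total_minutes, 60)
        return f"{hours} hours and {minutes} minutes"

    print(estimated_flight_time(8267.410))  # ≈ 16 hours and 9 minutes at 512 mph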

Flight carbon footprint between Lanseria International Airport (HLA) and Billy Bishop Toronto City Airport (YTZ)

On average, flying from Johannesburg to Toronto generates about 1,038 kg of CO2 per passenger, which is equivalent to 2,288 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
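The emission factor behind this estimate isn't published either; the quoted figures work out to roughly 0.078 kg of CO2 per passenger-kilometre. A sketch using that derived factor, together with the exact kilogram-to-pound conversion:

    KG_PER_LB = 0.45359237          # exact definition of the pound

    def co2_estimate_kg(distance_km, kg_per_pax_km=0.078):
        """Per-passenger CO2 from jet-fuel burn only; the factor is derived from
        the figures quoted above, not from a published methodology."""
        return distance_km * kg_per_pax_km

    kg = co2_estimate_kg(13305.107)
    print(f"{kg:.0f} kg CO2 per passenger ≈ {kg / KG_PER_LB:.0f} lbs")  # ≈ 1038 kg / 2288 lbs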

Map of flight path from Johannesburg to Toronto

See the map of the shortest flight path between Lanseria International Airport (HLA) and Billy Bishop Toronto City Airport (YTZ).
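To plot such a path yourself, one option is pyproj's Geod.npts, which returns evenly spaced points along the geodesic between the two airports (a sketch only, not the tool used to render the map above):

    from pyproj import Geod

    geod = Geod(ellps="WGS84")
    # 20 evenly spaced intermediate points along the geodesic (lon/lat order).
    waypoints = geod.npts(27.925833, -25.938333, -79.396111, 43.627222, 20)
    for lon, lat in waypoints:
        print(f"{lat:9.4f}, {lon:9.4f}")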

Airport information

Origin: Lanseria International Airport
City: Johannesburg
Country: South Africa
IATA Code: HLA
ICAO Code: FALA
Coordinates: 25°56′18″S, 27°55′33″E

Destination: Billy Bishop Toronto City Airport
City: Toronto
Country: Canada
IATA Code: YTZ
ICAO Code: CYTZ
Coordinates: 43°37′38″N, 79°23′46″W
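The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier on this page take decimal degrees. A small conversion sketch (hemisphere letters S and W map to negative values):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to decimal degrees."""
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    # Lanseria (HLA): 25°56′18″S, 27°55′33″E
    print(dms_to_decimal(25, 56, 18, "S"), dms_to_decimal(27, 55, 33, "E"))
    # Billy Bishop (YTZ): 43°37′38″N, 79°23′46″W
    print(dms_to_decimal(43, 37, 38, "N"), dms_to_decimal(79, 23, 46, "W"))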