
How far is Ushuaia from Toronto?

The distance between Toronto (Toronto Pearson International Airport) and Ushuaia (Ushuaia – Malvinas Argentinas International Airport) is 6817 miles / 10970 kilometers / 5924 nautical miles.

Toronto Pearson International Airport – Ushuaia – Malvinas Argentinas International Airport

  • 6817 miles
  • 10970 kilometers
  • 5924 nautical miles


Distance from Toronto to Ushuaia

There are several ways to calculate the distance from Toronto to Ushuaia. Here are two standard methods:

Vincenty's formula (applied above)
  • 6816.731 miles
  • 10970.465 kilometers
  • 5923.577 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
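The figures above can be reproduced with a self-contained sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (the usual choice for this calculation; the page does not name its ellipsoid, so treat the constants as an assumption). The coordinates are the decimal form of the airport coordinates listed below.

```python
import math

# WGS-84 ellipsoid parameters (assumed; Vincenty's formula needs an ellipsoid)
A_AXIS = 6378137.0            # semi-major axis, metres
F = 1 / 298.257223563         # flattening
B_AXIS = A_AXIS * (1 - F)     # semi-minor axis, metres

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in metres between two lat/lon points (in degrees)."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:   # converged
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma)

# YYZ and USH in decimal degrees (from the airport information section)
km = vincenty_inverse(43.67694, -79.63056, -54.84306, -68.29556) / 1000
print(f"{km:.3f} km")  # ≈ 10970 km, matching the figure above
```

Note that this simple iteration can fail to converge for nearly antipodal points; Toronto and Ushuaia are far from that case.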

Haversine formula
  • 6839.653 miles
  • 11007.355 kilometers
  • 5943.496 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
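The haversine figure follows directly from that spherical model. A minimal sketch, assuming the commonly used mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius (degrees in)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YYZ and USH in decimal degrees (from the airport information section)
km = haversine_km(43.67694, -79.63056, -54.84306, -68.29556)
print(f"{km:.1f} km")  # ≈ 11007 km, matching the figure above
```

The spherical result lands about 37 km longer than the ellipsoidal one here, a typical fraction-of-a-percent disagreement between the two models.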

How long does it take to fly from Toronto to Ushuaia?

The estimated flight time from Toronto Pearson International Airport to Ushuaia – Malvinas Argentinas International Airport is 13 hours and 24 minutes.
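The page does not state how the flight time is estimated. One common rule of thumb that reproduces the quoted figure is an average cruise speed of about 850 km/h plus a fixed 30 minutes for takeoff and landing; both numbers are assumptions here, not the site's documented method.

```python
distance_km = 10970.465   # Vincenty distance from the section above

CRUISE_KMH = 850.0        # assumed average cruise speed
GROUND_TIME_H = 0.5       # assumed allowance for takeoff and landing

hours = GROUND_TIME_H + distance_km / CRUISE_KMH
total_minutes = round(hours * 60)
print(f"{total_minutes // 60} hours and {total_minutes % 60} minutes")
# → 13 hours and 24 minutes
```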

Flight carbon footprint between Toronto Pearson International Airport (YYZ) and Ushuaia – Malvinas Argentinas International Airport (USH)

On average, flying from Toronto to Ushuaia generates about 829 kg of CO2 per passenger; 829 kilograms equals 1,828 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
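The kilogram-to-pound conversion behind that figure uses the exact definition of the avoirdupois pound:

```python
KG_PER_LB = 0.45359237    # exact: 1 lb = 0.45359237 kg

co2_kg = 829              # estimated CO2 per passenger, from above
co2_lbs = co2_kg / KG_PER_LB
print(f"{co2_lbs:.0f} lbs")  # → 1828 lbs
```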

Map of flight path from Toronto to Ushuaia

See the map of the shortest flight path between Toronto Pearson International Airport (YYZ) and Ushuaia – Malvinas Argentinas International Airport (USH).

Airport information

Origin Toronto Pearson International Airport
City: Toronto
Country: Canada
IATA Code: YYZ
ICAO Code: CYYZ
Coordinates: 43°40′37″N, 79°37′50″W
Destination Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
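The coordinates above are given in degrees-minutes-seconds; the distance formulas need them in decimal degrees. A small converter for the format used on this page (the function name and regex are illustrative, not from the site):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 43°40′37″N to signed decimal degrees."""
    m = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    # South and West hemispheres are negative by convention
    return -value if hemi in "SW" else value

print(dms_to_decimal("43°40′37″N"))   # ≈ 43.6769 (YYZ latitude)
print(dms_to_decimal("68°17′44″W"))   # ≈ -68.2956 (USH longitude)
```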