
How far is Lethbridge from Ushuaia?

The distance between Ushuaia (Ushuaia – Malvinas Argentinas International Airport) and Lethbridge (Lethbridge Airport) is 7643 miles / 12300 kilometers / 6641 nautical miles.

Ushuaia – Malvinas Argentinas International Airport – Lethbridge Airport

  • 7643 miles
  • 12300 kilometers
  • 6641 nautical miles


Distance from Ushuaia to Lethbridge

There are several ways to calculate the distance from Ushuaia to Lethbridge. Here are two standard methods:

Vincenty's formula (applied above)
  • 7642.804 miles
  • 12299.901 kilometers
  • 6641.415 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
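For reference, here is a minimal Python sketch of an ellipsoidal calculation; it assumes the third-party geopy package, whose geodesic routine works on the WGS-84 ellipsoid (via Karney's algorithm rather than Vincenty's iteration, though the two agree to well under a metre on a route like this). The decimal coordinates are converted from the DMS values listed under Airport information below.

    # Sketch, not the calculator's own code: assumes the third-party geopy package.
    from geopy.distance import geodesic

    ush = (-54.8431, -68.2956)   # USH: 54°50′35″S, 68°17′44″W as decimal degrees
    yql = (49.6303, -112.8000)   # YQL: 49°37′49″N, 112°48′0″W as decimal degrees

    d = geodesic(ush, yql)       # ellipsoidal (WGS-84) distance
    print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} NM")
    # Expect roughly 7643 mi / 12300 km / 6641 NM, in line with the figures above.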

Haversine formula
  • 7663.149 miles
  • 12332.643 kilometers
  • 6659.095 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
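A self-contained Python sketch of the haversine formula follows (again, not the site's own implementation). The mean earth radius of 3958.8 miles is an assumed value; other radius choices shift the result slightly.

    from math import radians, sin, cos, atan2, sqrt

    def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
        """Great-circle distance between two lat/lon points on an assumed spherical earth."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        c = 2 * atan2(sqrt(a), sqrt(1 - a))  # central angle in radians
        return radius_miles * c

    ush = (-54.8431, -68.2956)   # 54°50′35″S, 68°17′44″W
    yql = (49.6303, -112.8000)   # 49°37′49″N, 112°48′0″W
    print(round(haversine_miles(*ush, *yql), 1))
    # Roughly 7663 miles, matching the haversine figure above to within rounding.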

How long does it take to fly from Ushuaia to Lethbridge?

The estimated flight time from Ushuaia – Malvinas Argentinas International Airport to Lethbridge Airport is 14 hours and 58 minutes.

Flight carbon footprint between Ushuaia – Malvinas Argentinas International Airport (USH) and Lethbridge Airport (YQL)

On average, flying from Ushuaia to Lethbridge generates about 947 kg (2,087 lbs) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.

Map of flight path from Ushuaia to Lethbridge

See the map of the shortest flight path between Ushuaia – Malvinas Argentinas International Airport (USH) and Lethbridge Airport (YQL).

Airport information

Origin: Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
Destination: Lethbridge Airport
City: Lethbridge
Country: Canada
IATA Code: YQL
ICAO Code: CYQL
Coordinates: 49°37′49″N, 112°48′0″W
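The coordinates above are listed in degrees, minutes, and seconds. Below is a small sketch of the standard conversion to the signed decimal degrees used in the distance formulas earlier; the helper name is illustrative.

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus hemisphere letter (N/S/E/W) to signed decimal degrees."""
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    # USH: 54°50′35″S, 68°17′44″W  ->  about (-54.8431, -68.2956)
    print(dms_to_decimal(54, 50, 35, "S"), dms_to_decimal(68, 17, 44, "W"))
    # YQL: 49°37′49″N, 112°48′0″W  ->  about (49.6303, -112.8000)
    print(dms_to_decimal(49, 37, 49, "N"), dms_to_decimal(112, 48, 0, "W"))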