
How far is Cobija from Ushuaia?

The distance between Ushuaia (Ushuaia – Malvinas Argentinas International Airport) and Cobija (Captain Aníbal Arab Airport) is 3019 miles / 4859 kilometers / 2624 nautical miles.

The driving distance from Ushuaia (USH) to Cobija (CIJ) is 3919 miles / 6307 kilometers, and travel time by car is about 84 hours 47 minutes.


Distance from Ushuaia to Cobija

There are several ways to calculate the distance from Ushuaia to Cobija. Here are two standard methods:

Vincenty's formula (applied above)
  • 3019.268 miles
  • 4859.041 kilometers
  • 2623.672 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
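
For readers who want to reproduce the figure above, here is a minimal self-contained sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, iteration limit, and tolerance are illustrative choices, not this site's published code; the coordinates come from the airport table at the bottom of the page.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty's inverse method on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0                 # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(max_iter):     # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sig_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma * (
            cos_2sig_m + C * cos_sigma * (-1 + 2 * cos_2sig_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sig_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sig_m ** 2)
        - B / 6 * cos_2sig_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sig_m ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344   # metres -> miles

# USH -> CIJ, decimal degrees from the airport table below (south/west negative)
print(vincenty_miles(-54.843056, -68.295556, -11.040278, -68.782778))  # ~3019.3
```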

Haversine formula
  • 3026.609 miles
  • 4870.855 kilometers
  • 2630.051 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
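
The haversine formula is compact enough to show in full. The sketch below assumes a mean earth radius of 3,958.8 miles; with the airport coordinates from the table below it reproduces the ~3,026.6-mile figure quoted above.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a sphere of the given radius, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_miles * math.asin(math.sqrt(a))

# USH -> CIJ with the airport coordinates from the table below
print(haversine_miles(-54.843056, -68.295556, -11.040278, -68.782778))  # ~3026.6
```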

How long does it take to fly from Ushuaia to Cobija?

The estimated flight time from Ushuaia – Malvinas Argentinas International Airport to Captain Aníbal Arab Airport is 6 hours and 12 minutes.
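
The page does not state how this estimate is derived. A common approach is a fixed taxi/climb/descent allowance plus the distance divided by an average cruise speed; the constants in the sketch below are assumptions chosen so that the result lands on the quoted 6 hours 12 minutes, not the site's published method.

```python
# Hypothetical model: a fixed half-hour taxi/climb/descent allowance plus
# cruise at ~530 mph. Both constants are assumptions that happen to
# reproduce the figure quoted above.
def flight_time(distance_miles, cruise_mph=530, overhead_hours=0.5):
    hours = overhead_hours + distance_miles / cruise_mph
    return int(hours), round((hours - int(hours)) * 60)

h, m = flight_time(3019)
print(f"{h} h {m} min")   # 6 h 12 min
```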

Flight carbon footprint between Ushuaia – Malvinas Argentinas International Airport (USH) and Captain Aníbal Arab Airport (CIJ)

On average, flying from Ushuaia to Cobija generates about 337 kg (742 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
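
The emission factor behind the 337 kg figure is not published here; dividing 337 kg by 3,019 miles implies roughly 0.11 kg of CO2 per passenger-mile. The sketch below uses that implied factor as an assumption, so both the kilogram and pound figures above fall out of it.

```python
KG_PER_PASSENGER_MILE = 0.1115   # assumption: implied by ~337 kg over 3019 miles
LB_PER_KG = 2.20462

co2_kg = KG_PER_PASSENGER_MILE * 3019
print(round(co2_kg), "kg")               # 337 kg
print(round(co2_kg * LB_PER_KG), "lb")   # 742 lb
```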

Map of flight path and driving directions from Ushuaia to Cobija

See the map of the shortest flight path between Ushuaia – Malvinas Argentinas International Airport (USH) and Captain Aníbal Arab Airport (CIJ).

Airport information

Origin: Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
Destination: Captain Aníbal Arab Airport
City: Cobija
Country: Bolivia
IATA Code: CIJ
ICAO Code: SLCO
Coordinates: 11°2′25″S, 68°46′58″W
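
The coordinates above are given in degrees, minutes, and seconds. A small helper like the hypothetical one below converts them to the signed decimal degrees (south and west negative) that the distance formulas earlier on the page expect.

```python
import re

def dms_to_decimal(dms):
    """Parse e.g. 54°50′35″S into signed decimal degrees (hypothetical helper)."""
    deg, minutes, seconds, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

ush = (dms_to_decimal("54°50′35″S"), dms_to_decimal("68°17′44″W"))
cij = (dms_to_decimal("11°2′25″S"), dms_to_decimal("68°46′58″W"))
print(ush)   # (-54.8431..., -68.2956...)
print(cij)   # (-11.0403..., -68.7828...)
# Feeding these into the haversine or Vincenty sketches above reproduces
# the mileage figures quoted earlier on the page.
```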