
How far is Barcaldine from Ushuaia?

The distance between Ushuaia (Ushuaia – Malvinas Argentinas International Airport) and Barcaldine (Barcaldine Airport) is 6678 miles / 10747 kilometers / 5803 nautical miles.

Ushuaia – Malvinas Argentinas International Airport – Barcaldine Airport

  • 6678 miles
  • 10747 kilometers
  • 5803 nautical miles
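
These three values are the same distance in different units. A quick sanity check in Python, using the exact definitions 1 mile = 1.609344 km and 1 nautical mile = 1.852 km:

    MILES_TO_KM = 1.609344   # international statute mile, exact
    KM_TO_NMI = 1 / 1.852    # international nautical mile, exact

    miles = 6677.869
    print(round(miles * MILES_TO_KM, 3))              # 10746.988 km
    print(round(miles * MILES_TO_KM * KM_TO_NMI, 3))  # 5802.909 nautical miles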

Distance from Ushuaia to Barcaldine

There are several ways to calculate the distance from Ushuaia to Barcaldine. Here are two standard methods:

Vincenty's formula (applied above)
  • 6677.869 miles
  • 10746.988 kilometers
  • 5802.909 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
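
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are decimal-degree conversions of the airport coordinates listed below (south and west negative); the function name and defaults are illustrative, not this site's code, and the sketch omits the guards for coincident and equatorial points:

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        a = 6378137.0              # WGS-84 semi-major axis (metres)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)
        # Longitude difference wrapped to [-180, 180] degrees
        L = math.radians((lon2 - lon1 + 540) % 360 - 180)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sigma_m
                + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
        metres = b * A * (sigma - delta_sigma)
        return metres / 1609.344   # metres per statute mile

    # USH: 54°50′35″S, 68°17′44″W   BCI: 23°33′55″S, 145°18′25″E
    print(round(vincenty_miles(-54.843056, -68.295556,
                               -23.565278, 145.306944), 3))  # ≈ 6677.9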

Haversine formula
  • 6665.643 miles
  • 10727.312 kilometers
  • 5792.285 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
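
A compact Python version of the same computation, assuming the conventional mean earth radius of 6371 km (the value that reproduces the figures above):

    import math

    def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
        # Great-circle distance on a sphere of radius r_km
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * r_km * math.asin(math.sqrt(a))

    km = haversine_km(-54.843056, -68.295556, -23.565278, 145.306944)
    print(round(km, 1), round(km / 1.609344, 1))  # ≈ 10727.3 km, 6665.6 miles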

How long does it take to fly from Ushuaia to Barcaldine?

The estimated flight time from Ushuaia – Malvinas Argentinas International Airport to Barcaldine Airport is 13 hours and 8 minutes.
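
The page does not state its timing model. A common back-of-the-envelope approach, sketched below, is a fixed cruise speed plus a flat allowance for takeoff and landing; the 528 mph speed and 30-minute overhead are assumed values chosen because they land close to the quoted figure, not published parameters:

    def estimated_flight_time(distance_miles, cruise_mph=528.0, overhead_hours=0.5):
        # Block time = cruise segment + fixed takeoff/landing allowance
        hours = distance_miles / cruise_mph + overhead_hours
        h = int(hours)
        m = round((hours - h) * 60)
        return f"{h} h {m:02d} min"

    print(estimated_flight_time(6677.869))  # 13 h 09 min, close to the quoted 13 h 8 min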

Flight carbon footprint between Ushuaia – Malvinas Argentinas International Airport (USH) and Barcaldine Airport (BCI)

On average, flying from Ushuaia to Barcaldine generates about 810 kg of CO2 per passenger, which is equivalent to 1,785 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
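
As a sanity check on those numbers, here is the kilogram-to-pound conversion and the per-mile factor implied by the 810 kg figure; the factor is an inference from this page's own numbers, not a published emission model:

    CO2_KG = 810.0                 # per-passenger estimate quoted above
    LBS_PER_KG = 2.20462262

    print(int(CO2_KG * LBS_PER_KG))      # 1785 lbs, matching the figure above
    print(round(CO2_KG / 6677.869, 3))   # ≈ 0.121 kg CO2 per passenger-mile (implied)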

Map of flight path from Ushuaia to Barcaldine

See the map of the shortest flight path between Ushuaia – Malvinas Argentinas International Airport (USH) and Barcaldine Airport (BCI).

Airport information

Origin: Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
Destination: Barcaldine Airport
City: Barcaldine
Country: Australia
IATA Code: BCI
ICAO Code: YBAR
Coordinates: 23°33′55″S, 145°18′25″E