
How far is Juliaca from Ushuaia?

The distance between Ushuaia (Ushuaia – Malvinas Argentinas International Airport) and Juliaca (Inca Manco Cápac International Airport) is 2717 miles / 4372 kilometers / 2361 nautical miles.

The driving distance from Ushuaia (USH) to Juliaca (JUL) is 3433 miles / 5525 kilometers, and travel time by car is about 70 hours 35 minutes.

Distance from Ushuaia to Juliaca

There are several ways to calculate the distance from Ushuaia to Juliaca. Here are two standard methods:

Vincenty's formula (applied above)
  • 2716.681 miles
  • 4372.074 kilometers
  • 2360.731 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
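
As a quick cross-check, any geodesic library that works on the WGS-84 ellipsoid should land very close to these figures. A minimal sketch using geopy (an assumed dependency, not part of this site's tooling; its geodesic routine uses Karney's method, which agrees with Vincenty's to well under a meter for points like these):

    # pip install geopy  (assumed dependency)
    from geopy.distance import geodesic

    # Decimal-degree form of the airport coordinates listed below
    ush = (-54.843056, -68.295556)  # USH: 54°50′35″S, 68°17′44″W
    jul = (-15.466944, -70.158056)  # JUL: 15°28′1″S, 70°9′29″W

    d = geodesic(ush, jul)  # distance on the WGS-84 ellipsoid
    print(f"{d.miles:.3f} mi / {d.kilometers:.3f} km / {d.nautical:.3f} nmi")
    # Roughly 2716.7 mi / 4372.1 km / 2360.7 nmi, matching the Vincenty figures above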

Haversine formula
  • 2722.465 miles
  • 4381.383 kilometers
  • 2365.757 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
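
The haversine formula is compact enough to compute directly. A minimal Python sketch, assuming the conventional mean Earth radius of 6371 km (the radius choice is ours; the site does not state which value it uses):

    import math

    def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
        """Great-circle distance between two lat/lon points on a sphere."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
        return 2 * r_km * math.asin(math.sqrt(a))

    # Airport coordinates in decimal degrees (converted from the listing below)
    d = haversine_km(-54.843056, -68.295556, -15.466944, -70.158056)
    print(f"{d:.3f} km")  # about 4381.4 km, matching the haversine figure above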

How long does it take to fly from Ushuaia to Juliaca?

The estimated flight time from Ushuaia – Malvinas Argentinas International Airport to Inca Manco Cápac International Airport is 5 hours and 38 minutes.
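
The timing model behind this estimate is not published. As a rough sanity check, dividing the straight-line distance by an assumed average block speed of about 480 mph (our assumption, covering cruise plus climb and descent) lands close to the figure above:

    # Back-of-envelope check; 480 mph average block speed is an assumption
    distance_mi = 2717
    avg_speed_mph = 480
    hours = distance_mi / avg_speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    print(f"{h} h {m} min")  # 5 h 40 min, close to the 5 h 38 min estimate above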

Flight carbon footprint between Ushuaia – Malvinas Argentinas International Airport (USH) and Inca Manco Cápac International Airport (JUL)

On average, flying from Ushuaia to Juliaca generates about 301 kg (663 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
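
The unit conversion is straightforward; a one-line check using the exact kilogram-per-pound definition:

    KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound
    co2_kg = 301
    print(f"{co2_kg / KG_PER_LB:.1f} lb")  # 663.6 lb, i.e. the ~663 lbs quoted above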

Map of flight path and driving directions from Ushuaia to Juliaca

See the map of the shortest flight path between Ushuaia – Malvinas Argentinas International Airport (USH) and Inca Manco Cápac International Airport (JUL).

Airport information

Origin: Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
Destination: Inca Manco Cápac International Airport
City: Juliaca
Country: Perú
IATA Code: JUL
ICAO Code: SPJL
Coordinates: 15°28′1″S, 70°9′29″W
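
The distance sketches above use decimal degrees; the coordinates listed here convert as follows (a small hypothetical helper, not part of the site):

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    # USH: 54°50′35″S, 68°17′44″W -> (-54.843056, -68.295556)
    print(dms_to_decimal(54, 50, 35, "S"), dms_to_decimal(68, 17, 44, "W"))
    # JUL: 15°28′1″S, 70°9′29″W -> (-15.466944, -70.158056)
    print(dms_to_decimal(15, 28, 1, "S"), dms_to_decimal(70, 9, 29, "W"))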