
How far is Higuerote from Ushuaia?

The distance between Ushuaia (Ushuaia – Malvinas Argentinas International Airport) and Higuerote (Higuerote Airport) is 4499 miles / 7240 kilometers / 3909 nautical miles.

The driving distance from Ushuaia (USH) to Higuerote (HGE) is 6788 miles / 10925 kilometers, and travel time by car is about 147 hours 3 minutes.

Ushuaia – Malvinas Argentinas International Airport – Higuerote Airport

4499 miles / 7240 kilometers / 3909 nautical miles

Distance from Ushuaia to Higuerote

There are several ways to calculate the distance from Ushuaia to Higuerote. Here are two standard methods:

Vincenty's formula (applied above)
  • 4498.572 miles
  • 7239.750 kilometers
  • 3909.152 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
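As a rough illustration, here is a minimal sketch of Vincenty's inverse method on the WGS-84 ellipsoid, using the USH and HGE coordinates from the airport information below. The calculator's exact implementation is not published, so the ellipsoid constants, convergence tolerance, and iteration limit are assumptions.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula: geodesic distance on the WGS-84 ellipsoid (km)."""
    a, f = 6378137.0, 1 / 298.257223563          # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):                     # iterate on the longitude difference
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000   # metres -> kilometres

# Ushuaia (USH) and Higuerote (HGE) in decimal degrees (south/west negative)
print(round(vincenty_km(-54.8431, -68.2956, 10.4622, -66.0928)))  # ≈ 7240 km
```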

Haversine formula
  • 4514.023 miles
  • 7264.615 kilometers
  • 3922.578 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
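For comparison, here is a minimal sketch of the haversine calculation, assuming a mean Earth radius of 6371 km and the same coordinates; small differences from the figure above come from rounding.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Ushuaia (USH) and Higuerote (HGE) in decimal degrees (south/west negative)
ush = (-54.8431, -68.2956)   # 54°50′35″S, 68°17′44″W
hge = (10.4622, -66.0928)    # 10°27′44″N, 66°5′34″W
print(round(haversine_km(*ush, *hge)))  # ≈ 7265 km, within rounding of the figure above
```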

How long does it take to fly from Ushuaia to Higuerote?

The estimated flight time from Ushuaia – Malvinas Argentinas International Airport to Higuerote Airport is 9 hours and 1 minute.
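A common rule of thumb is to divide the great-circle distance by an assumed average block speed. The sketch below uses 500 mph, which is an assumption rather than the calculator's documented method, and lands close to the quoted figure.

```python
# Rough flight-time estimate: distance divided by an assumed average block speed.
distance_miles = 4499
avg_speed_mph = 500                                   # assumption, not the site's stated value
total_min = round(distance_miles / avg_speed_mph * 60)
print(f"{total_min // 60} h {total_min % 60} min")    # ≈ 9 h 0 min, close to the 9 h 1 min above
```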

Flight carbon footprint between Ushuaia – Malvinas Argentinas International Airport (USH) and Higuerote Airport (HGE)

On average, flying from Ushuaia to Higuerote generates about 519 kg (1,145 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
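As a back-of-the-envelope check, the sketch below derives the per-kilometre emission factor implied by the numbers above and converts kilograms to pounds. The roughly 0.07 kg CO2 per passenger-km factor is an inference from those figures, not the calculator's documented methodology.

```python
# Implied per-passenger emission factor and unit conversion for the figures above.
co2_kg = 519
distance_km = 7240
print(round(co2_kg / distance_km, 3))   # ≈ 0.072 kg CO2 per passenger-km (inferred)
print(round(co2_kg * 2.20462))          # ≈ 1144 lb; the 1,145 lb above likely uses an unrounded kg value
```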

Map of flight path and driving directions from Ushuaia to Higuerote

See the map of the shortest flight path between Ushuaia – Malvinas Argentinas International Airport (USH) and Higuerote Airport (HGE).

Airport information

Origin: Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
Destination: Higuerote Airport
City: Higuerote
Country: Venezuela
IATA Code: HGE
ICAO Code: SVHG
Coordinates: 10°27′44″N, 66°5′34″W