
How far is Jujuy from Neiva?

The distance between Neiva (Benito Salas Airport) and Jujuy (Gobernador Horacio Guzmán International Airport) is 2001 miles / 3220 kilometers / 1739 nautical miles.
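The three figures are unit conversions of the same great-circle distance. As a quick sketch using the standard conversion factors (1 mile = 1.609344 km, 1 nautical mile = 1.852 km):

```python
KM_PER_MILE = 1.609344          # international mile
KM_PER_NAUTICAL_MILE = 1.852    # international nautical mile

km = 3220.0
print(round(km / KM_PER_MILE))           # ≈ 2001 miles
print(round(km / KM_PER_NAUTICAL_MILE))  # ≈ 1739 nautical miles
```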

The driving distance from Neiva (NVA) to Jujuy (JUJ) is 3110 miles / 5005 kilometers, and travel time by car is about 70 hours 4 minutes.
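As a rough sanity check on the driving estimate, the implied average speed can be computed from the figures above:

```python
driving_miles = 3110
driving_km = 5005
driving_hours = 70 + 4 / 60  # 70 hours 4 minutes

print(round(driving_miles / driving_hours))  # ≈ 44 mph average
print(round(driving_km / driving_hours))     # ≈ 71 km/h average
```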

Benito Salas Airport – Gobernador Horacio Guzmán International Airport


Distance from Neiva to Jujuy

There are several ways to calculate the distance from Neiva to Jujuy. Here are two standard methods:

Vincenty's formula (applied above)
  • 2000.815 miles
  • 3220.000 kilometers
  • 1738.661 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
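As an illustrative sketch (not necessarily the calculator's exact code), the standard iterative Vincenty inverse solution on the WGS-84 ellipsoid can be implemented directly, using the airport coordinates listed below in decimal degrees:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometres."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero for equatorial lines where cos2_alpha == 0
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000

# Neiva (NVA) to Jujuy (JUJ)
print(round(vincenty_km(2.95, -75.293889, -24.392778, -65.097778)))  # ≈ 3220
```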

Haversine formula
  • 2009.502 miles
  • 3233.980 kilometers
  • 1746.210 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
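A minimal haversine sketch, assuming the commonly used mean Earth radius of 6371 km, reproduces the figure above to within a few kilometres:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# Neiva (NVA) to Jujuy (JUJ)
print(round(haversine_km(2.95, -75.293889, -24.392778, -65.097778)))  # ≈ 3234
```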

How long does it take to fly from Neiva to Jujuy?

The estimated flight time from Benito Salas Airport to Gobernador Horacio Guzmán International Airport is 4 hours and 17 minutes.
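Using only figures already on this page, the estimate works out to 257 minutes over roughly 2001 miles; the implied average block speed is:

```python
distance_miles = 2001
flight_minutes = 4 * 60 + 17  # 4 hours 17 minutes

avg_speed_mph = distance_miles / (flight_minutes / 60)
print(round(avg_speed_mph))  # ≈ 467 mph
```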

Flight carbon footprint between Benito Salas Airport (NVA) and Gobernador Horacio Guzmán International Airport (JUJ)

On average, flying from Neiva to Jujuy generates about 218 kg of CO2 per passenger, which is roughly 480 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Neiva to Jujuy

See the map of the shortest flight path between Benito Salas Airport (NVA) and Gobernador Horacio Guzmán International Airport (JUJ).

Airport information

Origin: Benito Salas Airport
City: Neiva
Country: Colombia
IATA Code: NVA
ICAO Code: SKNV
Coordinates: 2°57′0″N, 75°17′38″W
Destination: Gobernador Horacio Guzmán International Airport
City: Jujuy
Country: Argentina
IATA Code: JUJ
ICAO Code: SASJ
Coordinates: 24°23′34″S, 65°5′52″W
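The coordinates above are given in degrees/minutes/seconds; a small helper (illustrative, not part of the site) converts them to the decimal degrees used by the distance formulas:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Gobernador Horacio Guzmán International Airport: 24°23′34″S, 65°5′52″W
print(round(dms_to_decimal(24, 23, 34, "S"), 6))  # -24.392778
print(round(dms_to_decimal(65, 5, 52, "W"), 6))   # -65.097778
```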