
How far is Jujuy from Freetown?

The distance between Freetown (Lungi International Airport) and Jujuy (Gobernador Horacio Guzmán International Airport) is 4172 miles / 6714 kilometers / 3625 nautical miles.

Lungi International Airport – Gobernador Horacio Guzmán International Airport

  • 4172 miles
  • 6714 kilometers
  • 3625 nautical miles


Distance from Freetown to Jujuy

There are several ways to calculate the distance from Freetown to Jujuy. Here are two standard methods:

Vincenty's formula (applied above)
  • 4171.614 miles
  • 6713.561 kilometers
  • 3625.033 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
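As an illustration, Vincenty's inverse method can be sketched as below, using the airport coordinates from the airport information section (converted to decimal degrees). This is a generic textbook implementation on the WGS-84 ellipsoid, not the calculator's own code:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in meters between two points on the WGS-84 ellipsoid,
    via Vincenty's iterative inverse formula."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L

    # Iterate on the longitude difference on the auxiliary sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(
            math.cos(U2) * sin_lam,
            math.cos(U1) * math.sin(U2) - math.sin(U1) * math.cos(U2) * cos_lam,
        )
        cos_sigma = math.sin(U1) * math.sin(U2) + math.cos(U1) * math.cos(U2) * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = math.cos(U1) * math.cos(U2) * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * math.sin(U1) * math.sin(U2) / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    # Evaluate the ellipsoidal correction series and the distance
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m
            * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# FNA (8°36′59″N, 13°11′43″W) to JUJ (24°23′34″S, 65°5′52″W)
meters = vincenty_inverse(8.61639, -13.19528, -24.39278, -65.09778)
print(meters / 1000)  # ≈ 6713.6 km, matching the figure above
```

Note that this simple loop does not handle the near-antipodal and equatorial edge cases, which need special treatment; the Freetown–Jujuy pair is far from both.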

Haversine formula
  • 4174.290 miles
  • 6717.869 kilometers
  • 3627.359 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
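The haversine calculation is much shorter. A minimal sketch, assuming a mean Earth radius of 6371 km (the radius the site uses is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# FNA (8°36′59″N, 13°11′43″W) to JUJ (24°23′34″S, 65°5′52″W)
km = haversine_km(8.61639, -13.19528, -24.39278, -65.09778)
print(km)  # ≈ 6717.9 km, matching the figure above
```

The ~4 km gap between this result and Vincenty's reflects the difference between the spherical and ellipsoidal Earth models.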

How long does it take to fly from Freetown to Jujuy?

The estimated flight time from Lungi International Airport to Gobernador Horacio Guzmán International Airport is 8 hours and 23 minutes.
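The calculator's timing model is not disclosed, but a back-of-the-envelope estimate from the distance and an assumed average block speed of about 500 mph lands close to the figure above:

```python
def flight_time_minutes(distance_miles, avg_speed_mph=500):
    """Rough block-time estimate; real schedules also depend on taxi time,
    climb/descent profiles, routing and winds."""
    return round(distance_miles / avg_speed_mph * 60)

total = flight_time_minutes(4172)
print(f"{total // 60} h {total % 60} min")  # 8 h 21 min, close to the 8 h 23 min above
```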

Flight carbon footprint between Lungi International Airport (FNA) and Gobernador Horacio Guzmán International Airport (JUJ)

On average, flying from Freetown to Jujuy generates about 478 kg of CO2 per passenger, which is equivalent to 1,053 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
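The pounds figure is a straight unit conversion (1 lb = 0.45359237 kg, exact by definition); the stated value appears to be truncated rather than rounded:

```python
KG_PER_LB = 0.45359237  # exact by international definition

def kg_to_lb(kg):
    """Convert kilograms to avoirdupois pounds."""
    return kg / KG_PER_LB

print(int(kg_to_lb(478)))  # 1053 (478 kg ≈ 1053.8 lb, truncated to whole pounds)
```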

Map of flight path from Freetown to Jujuy

See the map of the shortest flight path between Lungi International Airport (FNA) and Gobernador Horacio Guzmán International Airport (JUJ).

Airport information

Origin: Lungi International Airport
City: Freetown
Country: Sierra Leone
IATA Code: FNA
ICAO Code: GFLL
Coordinates: 8°36′59″N, 13°11′43″W
Destination: Gobernador Horacio Guzmán International Airport
City: Jujuy
Country: Argentina
IATA Code: JUJ
ICAO Code: SASJ
Coordinates: 24°23′34″S, 65°5′52″W