
How far is Jujuy from Stockholm?

The distance between Stockholm (Stockholm Bromma Airport) and Jujuy (Gobernador Horacio Guzmán International Airport) is 7410 miles / 11926 kilometers / 6439 nautical miles.

Stockholm Bromma Airport – Gobernador Horacio Guzmán International Airport
  • 7410 miles
  • 11926 kilometers
  • 6439 nautical miles


Distance from Stockholm to Jujuy

There are several ways to calculate the distance from Stockholm to Jujuy. Here are two standard methods:

Vincenty's formula (applied above)
  • 7410.312 miles
  • 11925.742 kilometers
  • 6439.385 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
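As a rough illustration (not the page's own implementation), an ellipsoidal distance like this can be sketched in Python with the geopy library, whose geodesic() also works on the WGS-84 ellipsoid. It uses Karney's algorithm, a refinement of Vincenty's approach, so its output can differ slightly from the figures above:

    from geopy.distance import geodesic

    # Airport coordinates in decimal degrees (converted from the DMS values
    # listed under "Airport information" below).
    bma = (59.354167, 17.941667)    # Stockholm Bromma Airport (BMA)
    juj = (-24.392778, -65.097778)  # Gobernador Horacio Guzmán Intl (JUJ)

    d = geodesic(bma, juj)
    print(f"{d.miles:.3f} mi / {d.kilometers:.3f} km / {d.nautical:.3f} nmi")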

Haversine formula
  • 7420.685 miles
  • 11942.434 kilometers
  • 6448.399 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
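The haversine formula is simple enough to implement directly. A minimal sketch in Python, assuming a mean Earth radius of 6371 km (the exact figures depend on the radius chosen):

    import math

    def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
        """Great-circle distance in km between two points on a sphere."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * r_km * math.asin(math.sqrt(a))

    km = haversine_km(59.354167, 17.941667, -24.392778, -65.097778)
    print(f"{km / 1.609344:.3f} mi / {km:.3f} km / {km / 1.852:.3f} nmi")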

How long does it take to fly from Stockholm to Jujuy?

The estimated flight time from Stockholm Bromma Airport to Gobernador Horacio Guzmán International Airport is 14 hours and 31 minutes.
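The page does not state the model behind this estimate. A common back-of-the-envelope approach is distance divided by an average speed, plus a fixed allowance for takeoff and landing; the speed and allowance below are assumptions, not the site's actual parameters:

    # Assumed values, not taken from the source page.
    DISTANCE_MI = 7410
    CRUISE_MPH = 500     # assumed average speed over the whole flight
    OVERHEAD_MIN = 30    # assumed allowance for takeoff, climb and descent

    minutes = DISTANCE_MI / CRUISE_MPH * 60 + OVERHEAD_MIN
    print(f"{int(minutes // 60)} h {int(minutes % 60)} min")  # -> 15 h 19 min

With these assumptions the estimate comes out somewhat longer than the page's 14 hours 31 minutes, which implies a slightly higher assumed average speed.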

Flight carbon footprint between Stockholm Bromma Airport (BMA) and Gobernador Horacio Guzmán International Airport (JUJ)

On average, flying from Stockholm to Jujuy generates about 913 kg of CO2 per passenger, which is equivalent to 2,013 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
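The kilograms-to-pounds conversion can be checked directly, using the exact definition of the pound:

    KG_PER_LB = 0.45359237          # exact, by definition of the pound
    print(round(913 / KG_PER_LB))   # -> 2013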

Map of flight path from Stockholm to Jujuy

See the map of the shortest flight path between Stockholm Bromma Airport (BMA) and Gobernador Horacio Guzmán International Airport (JUJ).

Airport information

Origin: Stockholm Bromma Airport
City: Stockholm
Country: Sweden
IATA Code: BMA
ICAO Code: ESSB
Coordinates: 59°21′15″N, 17°56′30″E
Destination: Gobernador Horacio Guzmán International Airport
City: Jujuy
Country: Argentina
IATA Code: JUJ
ICAO Code: SASJ
Coordinates: 24°23′34″S, 65°5′52″W
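The distance formulas above expect decimal degrees, so the DMS coordinates listed here need converting first. A small illustrative helper (dms_to_decimal is a hypothetical name, not from the source page):

    def dms_to_decimal(deg, minutes, sec, hemi):
        """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
        sign = -1 if hemi in ("S", "W") else 1
        return sign * (deg + minutes / 60 + sec / 3600)

    print(dms_to_decimal(59, 21, 15, "N"), dms_to_decimal(17, 56, 30, "E"))  # BMA: 59.3542, 17.9417
    print(dms_to_decimal(24, 23, 34, "S"), dms_to_decimal(65, 5, 52, "W"))   # JUJ: -24.3928, -65.0978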