
How far is Jönköping from Lanzarote?

The distance between Lanzarote (Lanzarote Airport) and Jönköping (Jönköping Airport) is 2394 miles / 3852 kilometers / 2080 nautical miles.

The driving distance from Lanzarote (ACE) to Jönköping (JKG) is 3025 miles / 4869 kilometers, and travel time by car is about 73 hours 43 minutes.

Lanzarote Airport – Jönköping Airport
2394 miles / 3852 kilometers / 2080 nautical miles


Distance from Lanzarote to Jönköping

There are several ways to calculate the distance from Lanzarote to Jönköping. Here are two standard methods:

Vincenty's formula (applied above)
  • 2393.531 miles
  • 3852.014 kilometers
  • 2079.921 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
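As a rough sketch (not necessarily the exact implementation used above), an ellipsoidal-model distance can be computed with the geopy library. Its geodesic distance uses the WGS-84 ellipsoid via Karney's algorithm rather than Vincenty's, but the two agree extremely closely. The decimal coordinates below are converted from the airport coordinates listed at the end of this page.

```python
# Sketch of an ellipsoidal (WGS-84) distance with geopy; geopy's geodesic uses
# Karney's algorithm, which agrees with Vincenty's result to well under a metre.
from geopy.distance import geodesic

ace = (28.9453, -13.6050)   # Lanzarote Airport (28°56′43″N, 13°36′18″W)
jkg = (57.7575, 14.0686)    # Jönköping Airport (57°45′27″N, 14°4′7″E)

d = geodesic(ace, jkg)
print(d.miles, d.km, d.nautical)  # roughly 2393.5 mi, 3852.0 km, 2079.9 NM
```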

Haversine formula
  • 2392.984 miles
  • 3851.134 kilometers
  • 2079.446 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
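A minimal sketch of the haversine calculation, using the same airport coordinates in decimal degrees and a mean Earth radius of 6371 km:

```python
# Great-circle (haversine) distance on a spherical Earth.
from math import radians, sin, cos, atan2, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Distance in kilometres between two latitude/longitude points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlambda = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlambda / 2) ** 2
    return radius_km * 2 * atan2(sqrt(a), sqrt(1 - a))

# Lanzarote Airport to Jönköping Airport: prints roughly 3851 km
print(haversine_km(28.9453, -13.6050, 57.7575, 14.0686))
```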

How long does it take to fly from Lanzarote to Jönköping?

The estimated flight time from Lanzarote Airport to Jönköping Airport is 5 hours and 1 minute.
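The assumptions behind this estimate are not stated. A common back-of-envelope approach is to divide the great-circle distance by an assumed average block speed and add a fixed allowance for taxi, takeoff and landing; with the hypothetical values below it lands in the same ballpark, though not on the exact figure above.

```python
# Hypothetical flight-time estimate: assumed 500 mph average speed plus a
# 30-minute ground/climb allowance. These numbers are illustrative only.
def estimated_flight_time_minutes(distance_miles, avg_speed_mph=500, overhead_min=30):
    return distance_miles / avg_speed_mph * 60 + overhead_min

print(estimated_flight_time_minutes(2394))  # about 317 minutes (~5 h 17 min)
```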

Flight carbon footprint between Lanzarote Airport (ACE) and Jönköping Airport (JKG)

On average, flying from Lanzarote to Jönköping generates about 263 kg of CO2 per passenger, which is equivalent to about 580 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Lanzarote to Jönköping

See the map of the shortest flight path between Lanzarote Airport (ACE) and Jönköping Airport (JKG).

Airport information

Origin Lanzarote Airport
City: Lanzarote
Country: Spain
IATA Code: ACE
ICAO Code: GCRR
Coordinates: 28°56′43″N, 13°36′18″W
Destination Jönköping Airport
City: Jönköping
Country: Sweden
IATA Code: JKG
ICAO Code: ESGJ
Coordinates: 57°45′27″N, 14°4′7″E