
How far is Jerez de la Frontera from Annaba?

The distance between Annaba (Rabah Bitat Airport) and Jerez de la Frontera (Jerez Airport) is 769 miles / 1237 kilometers / 668 nautical miles.

Rabah Bitat Airport – Jerez Airport

769 miles / 1237 kilometers / 668 nautical miles


Distance from Annaba to Jerez de la Frontera

There are several ways to calculate the distance from Annaba to Jerez de la Frontera. Here are two standard methods:

Vincenty's formula (applied above)
  • 768.613 miles
  • 1236.963 kilometers
  • 667.907 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
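As a sketch, the inverse Vincenty computation can be implemented on the WGS-84 ellipsoid (the standard semi-major axis and flattening values are assumed here; the exact implementation used above is not shown):

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_m(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance in metres on the (assumed) WGS-84 ellipsoid."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

    lam = L
    for _ in range(200):          # iterate until lambda converges
        sinLam, cosLam = sin(lam), cos(lam)
        sinSigma = sqrt((cosU2 * sinLam) ** 2 +
                        (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0            # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)

# AAE (36°49′19″N, 7°48′33″E) to XRY (36°44′40″N, 6°3′36″W)
d = vincenty_m(36 + 49/60 + 19/3600, 7 + 48/60 + 33/3600,
               36 + 44/60 + 40/3600, -(6 + 3/60 + 36/3600))  # ≈ 1,237,000 m
```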

Haversine formula
  • 766.832 miles
  • 1234.097 kilometers
  • 666.359 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
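The haversine computation is short enough to sketch directly, using the airport coordinates listed below (a mean Earth radius of 6371 km is assumed):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance on a sphere of radius r_km (assumed 6371 km)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r_km * asin(sqrt(a))

# AAE (36°49′19″N, 7°48′33″E) to XRY (36°44′40″N, 6°3′36″W)
aae = (36 + 49/60 + 19/3600, 7 + 48/60 + 33/3600)
xry = (36 + 44/60 + 40/3600, -(6 + 3/60 + 36/3600))
d = haversine_km(*aae, *xry)  # ≈ 1234 km
```

Note the west longitude of Jerez enters as a negative value, so the two airports sit on opposite sides of the prime meridian.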

How long does it take to fly from Annaba to Jerez de la Frontera?

The estimated flight time from Rabah Bitat Airport to Jerez Airport is 1 hour and 57 minutes.
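The exact estimator behind that figure isn't published; a common rule of thumb adds a fixed taxi/climb/descent overhead to the distance flown at a typical jet cruise speed. Both parameters below (30 minutes, 500 mph) are assumptions for illustration:

```python
def flight_time_min(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: fixed overhead plus cruise at constant speed.
    cruise_mph and overhead_min are assumed illustrative values."""
    return overhead_min + distance_miles / cruise_mph * 60

t = flight_time_min(769)  # ≈ 122 min with these parameters,
                          # in the same ballpark as the quoted 1 h 57 min
```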

What is the time difference between Annaba and Jerez de la Frontera?

There is no time difference between Annaba and Jerez de la Frontera.

Flight carbon footprint between Rabah Bitat Airport (AAE) and Jerez Airport (XRY)

On average, flying from Annaba to Jerez de la Frontera generates about 132 kg of CO2 per passenger, which is equivalent to about 291 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
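The kilogram-to-pound conversion quoted above can be checked directly (1 kg ≈ 2.20462 lb):

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 132
co2_lb = co2_kg * KG_TO_LB  # ≈ 291 lb
```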

Map of flight path from Annaba to Jerez de la Frontera

See the map of the shortest flight path between Rabah Bitat Airport (AAE) and Jerez Airport (XRY).

Airport information

Origin Rabah Bitat Airport
City: Annaba
Country: Algeria
IATA Code: AAE
ICAO Code: DABB
Coordinates: 36°49′19″N, 7°48′33″E
Destination Jerez Airport
City: Jerez de la Frontera
Country: Spain
IATA Code: XRY
ICAO Code: LEJR
Coordinates: 36°44′40″N, 6°3′36″W