
How far is Jerez de la Frontera from Zanzibar?

The distance between Zanzibar (Abeid Amani Karume International Airport) and Jerez de la Frontera (Jerez Airport) is 4159 miles / 6693 kilometers / 3614 nautical miles.

Abeid Amani Karume International Airport – Jerez Airport

4159 miles / 6693 kilometers / 3614 nautical miles


Distance from Zanzibar to Jerez de la Frontera

There are several ways to calculate the distance from Zanzibar to Jerez de la Frontera. Here are two standard methods:

Vincenty's formula (applied above)
  • 4159.056 miles
  • 6693.351 kilometers
  • 3614.121 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
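
As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed in the airport information below converted to decimal degrees. The function name, iteration limit, and convergence tolerance are choices made for this example, not details taken from the calculator itself.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in kilometres between two points via Vincenty's
    inverse formula on the WGS-84 ellipsoid."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha != 0 else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # metres -> kilometres

# ZNZ (6°13′19″S, 39°13′29″E) to XRY (36°44′40″N, 6°3′36″W), decimal degrees
print(round(vincenty_distance(-6.2219, 39.2247, 36.7444, -6.0600)))  # ≈ 6693 km
```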

Haversine formula
  • 4165.569 miles
  • 6703.834 kilometers
  • 3619.781 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
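
For comparison, a minimal haversine sketch, assuming a mean Earth radius of 6371 km. The calculator does not state which radius it uses, so the last digits may differ slightly from the figures above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# Same airport coordinates as above (decimal degrees)
print(round(haversine_distance(-6.2219, 39.2247, 36.7444, -6.0600)))  # ≈ 6704 km
```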

How long does it take to fly from Zanzibar to Jerez de la Frontera?

The estimated flight time from Abeid Amani Karume International Airport to Jerez Airport is 8 hours and 22 minutes.
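
The page does not explain how this estimate is derived. As a rough sanity check, assuming an average block speed of about 500 mph (an assumption for illustration, not the calculator's stated method), the 4159-mile distance works out to roughly the same duration:

```python
# Rough flight-time estimate: distance divided by an assumed average speed.
distance_miles = 4159
avg_speed_mph = 500          # assumption; real cruise speed and winds vary
hours = distance_miles / avg_speed_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ≈ 8 h 19 min
```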

Flight carbon footprint between Abeid Amani Karume International Airport (ZNZ) and Jerez Airport (XRY)

On average, flying from Zanzibar to Jerez de la Frontera generates about 476 kg of CO2 per passenger, which is roughly 1,050 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path from Zanzibar to Jerez de la Frontera

See the map of the shortest flight path between Abeid Amani Karume International Airport (ZNZ) and Jerez Airport (XRY).

Airport information

Origin: Abeid Amani Karume International Airport
City: Zanzibar
Country: Tanzania
IATA Code: ZNZ
ICAO Code: HTZA
Coordinates: 6°13′19″S, 39°13′29″E
Destination: Jerez Airport
City: Jerez de la Frontera
Country: Spain
IATA Code: XRY
ICAO Code: LEJR
Coordinates: 36°44′40″N, 6°3′36″W