
How far is Jerez de la Frontera from La Isabela?

The distance between La Isabela (La Isabela International Airport) and Jerez de la Frontera (Jerez Airport) is 4038 miles / 6499 kilometers / 3509 nautical miles.

La Isabela International Airport – Jerez Airport: 4038 miles / 6499 kilometers / 3509 nautical miles


Distance from La Isabela to Jerez de la Frontera

There are several ways to calculate the distance from La Isabela to Jerez de la Frontera. Here are two standard methods:

Vincenty's formula (applied above)
  • 4038.459 miles
  • 6499.269 kilometers
  • 3509.325 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
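As a sketch of how such a calculation works (not necessarily this site's exact implementation), Vincenty's inverse method can be written in pure Python on the WGS-84 ellipsoid. The decimal coordinates below are converted from the airport information section at the end of this page.

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse distance on the WGS-84 ellipsoid, in metres."""
    a = 6378137.0              # WGS-84 semi-major axis
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam, cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# JBQ (18°34′21″N, 69°59′8″W) -> XRY (36°44′40″N, 6°3′36″W), in decimal degrees
d_km = vincenty_m(18.5725, -69.985556, 36.744444, -6.06) / 1000
print(f"{d_km:.3f} km")  # roughly the 6499.269 km quoted above
```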

Haversine formula
  • 4033.084 miles
  • 6490.620 kilometers
  • 3504.654 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
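The haversine formula is short enough to show in full. This sketch assumes a mean Earth radius of 6371 km (a common convention; the page does not state which radius it uses), with the same airport coordinates as above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# JBQ -> XRY, coordinates from the airport information section
d_km = haversine_km(18.5725, -69.985556, 36.744444, -6.06)
print(f"{d_km:.1f} km")  # close to the 6490.620 km quoted above
```

The spherical result differs from Vincenty's ellipsoidal result by only about 0.1% on this route, which is why the haversine formula is often preferred when speed matters more than precision.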

How long does it take to fly from La Isabela to Jerez de la Frontera?

The estimated flight time from La Isabela International Airport to Jerez Airport is 8 hours and 8 minutes.
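A first-order estimate like this one can be reproduced by dividing the distance by an average block speed. The 500 mph figure below is an assumption for a typical commercial airliner; the page does not state the exact speed or ground-time allowance behind its 8 h 8 min figure.

```python
# Rough flight-time estimate: distance / average speed.
distance_miles = 4038.459   # Vincenty distance from above
avg_speed_mph = 500         # assumed typical airliner block speed

total_minutes = distance_miles / avg_speed_mph * 60
hours, minutes = divmod(round(total_minutes), 60)
print(f"~{hours} h {minutes} min")  # in the neighbourhood of the quoted 8 h 8 min
```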

Flight carbon footprint between La Isabela International Airport (JBQ) and Jerez Airport (XRY)

On average, flying from La Isabela to Jerez de la Frontera generates about 461 kg of CO2 per passenger, which is roughly 1,017 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
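The unit conversion and the per-kilometre emission rate implied by these numbers can be checked directly. The ~71 g per passenger-kilometre factor below is derived from this page's own figures, not an official emission factor.

```python
co2_kg = 461.0           # per-passenger estimate quoted above
distance_km = 6499.269   # Vincenty distance quoted above

co2_lbs = co2_kg * 2.20462               # kilograms -> pounds
per_km_g = co2_kg / distance_km * 1000   # implied grams of CO2 per passenger-km
print(f"{co2_lbs:.0f} lbs, ~{per_km_g:.0f} g CO2 per passenger-km")
```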

Map of flight path from La Isabela to Jerez de la Frontera

See the map of the shortest flight path between La Isabela International Airport (JBQ) and Jerez Airport (XRY).

Airport information

Origin: La Isabela International Airport
City: La Isabela
Country: Dominican Republic
IATA Code: JBQ
ICAO Code: MDJB
Coordinates: 18°34′21″N, 69°59′8″W
Destination: Jerez Airport
City: Jerez de la Frontera
Country: Spain
IATA Code: XRY
ICAO Code: LEJR
Coordinates: 36°44′40″N, 6°3′36″W
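The coordinates above are given in degrees, minutes, and seconds, while distance formulas expect signed decimal degrees. A small helper sketch for the conversion (the function name is illustrative, not from any particular library):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees; south and west are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# JBQ: 18°34′21″N, 69°59′8″W
lat = dms_to_decimal(18, 34, 21, "N")  # 18.5725
lon = dms_to_decimal(69, 59, 8, "W")   # about -69.9856
```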