
How far is Liepāja from Jerez de la Frontera?

The distance between Jerez de la Frontera (Jerez Airport) and Liepāja (Liepāja International Airport) is 1856 miles / 2987 kilometers / 1613 nautical miles.

The driving distance from Jerez de la Frontera (XRY) to Liepāja (LPX) is 2394 miles / 3852 kilometers, and travel time by car is about 41 hours 51 minutes.

Jerez Airport – Liepāja International Airport

1856 miles / 2987 kilometers / 1613 nautical miles


Distance from Jerez de la Frontera to Liepāja

There are several ways to calculate the distance from Jerez de la Frontera to Liepāja. Here are two standard methods:

Vincenty's formula (applied above)
  • 1856.286 miles
  • 2987.402 kilometers
  • 1613.068 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
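
For readers who want to reproduce the ellipsoidal figure, the sketch below is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed at the bottom of this page converted to decimal degrees. The function name and the convergence tolerance are illustrative choices, and this site's exact constants and rounding are not published, so small differences in the last decimals are expected; the result should land very close to the 1856 mi / 2987 km / 1613 nmi figures quoted above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in meters between two points via Vincenty's inverse formula
    on the WGS-84 ellipsoid. Inputs are decimal degrees. May fail to
    converge for nearly antipodal points."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); guard against division by zero on the equator
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
                                     C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# XRY (36°44′40″N, 6°3′36″W) to LPX (56°31′3″N, 21°5′48″E)
meters = vincenty_distance(36.7444, -6.06, 56.5175, 21.0967)
print(f"{meters / 1609.344:.1f} mi, {meters / 1000:.1f} km, {meters / 1852:.1f} nmi")
```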

Haversine formula
  • 1854.152 miles
  • 2983.968 kilometers
  • 1611.214 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (the great-circle distance, i.e. the shortest path between two points along the surface of the sphere).
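
As a cross-check, the haversine figure can be reproduced with a few lines of Python. The sketch below assumes a mean earth radius of 6371 km (a common choice; the radius this site uses is not stated) and the airport coordinates given at the bottom of the page, converted to decimal degrees.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers between two points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# XRY (36°44′40″N, 6°3′36″W) to LPX (56°31′3″N, 21°5′48″E)
km = haversine_distance(36.7444, -6.06, 56.5175, 21.0967)
print(f"{km / 1.609344:.1f} mi, {km:.1f} km, {km / 1.852:.1f} nmi")
# Expect roughly 1854 mi / 2984 km / 1611 nmi, in line with the figures above.
```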

How long does it take to fly from Jerez de la Frontera to Liepāja?

The estimated flight time from Jerez Airport to Liepāja International Airport is 4 hours and 0 minutes.
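
The assumptions behind the flight-time estimate are not published. A common rule of thumb is to divide the distance by a typical cruise speed of around 500 mph and add roughly half an hour for taxi, climb, and descent; the sketch below uses those assumed values (not the site's) and lands close to, though not exactly on, the 4-hour figure quoted above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent
    allowance. The speed and overhead are assumptions, not this site's values."""
    hours = distance_miles / cruise_mph + overhead_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours {m} minutes"

print(estimate_flight_time(1856))  # about 4 hours 13 minutes with these assumptions
```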

Flight carbon footprint between Jerez Airport (XRY) and Liepāja International Airport (LPX)

On average, flying from Jerez de la Frontera to Liepāja generates about 205 kg of CO2 per passenger, which is roughly 451 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
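
The pound conversion and the implied per-kilometre intensity follow directly from the figures above; the short sketch below simply restates that arithmetic (the emission model behind the 205 kg estimate is not published).

```python
co2_kg = 205.0      # quoted per-passenger estimate for this route
route_km = 2987.0   # great-circle distance quoted above

co2_lbs = co2_kg * 2.20462               # close to the 451 lbs quoted above
grams_per_km = co2_kg * 1000 / route_km  # roughly 69 g CO2 per passenger-km

print(f"{co2_lbs:.0f} lbs, {grams_per_km:.0f} g CO2 per passenger-km")
```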

Map of flight path and driving directions from Jerez de la Frontera to Liepāja

See the map of the shortest flight path between Jerez Airport (XRY) and Liepāja International Airport (LPX).

Airport information

Origin: Jerez Airport
City: Jerez de la Frontera
Country: Spain
IATA Code: XRY
ICAO Code: LEJR
Coordinates: 36°44′40″N, 6°3′36″W

Destination: Liepāja International Airport
City: Liepāja
Country: Latvia
IATA Code: LPX
ICAO Code: EVLA
Coordinates: 56°31′3″N, 21°5′48″E