
How far is Jerez de la Frontera from Umeå?

The distance between Umeå (Umeå Airport) and Jerez de la Frontera (Jerez Airport) is 2169 miles / 3491 kilometers / 1885 nautical miles.

The driving distance from Umeå (UME) to Jerez de la Frontera (XRY) is 2652 miles / 4268 kilometers, and travel time by car is about 47 hours 6 minutes.


Distance from Umeå to Jerez de la Frontera

There are several ways to calculate the distance from Umeå to Jerez de la Frontera. Here are two standard methods:

Vincenty's formula (applied above)
  • 2169.204 miles
  • 3490.996 kilometers
  • 1884.987 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
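
For readers who want to reproduce the figure, here is a minimal, self-contained Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates come from the airport information section below; the calculator's own implementation is not published, so treat this as illustrative rather than definitive.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in km between two points via Vincenty's inverse method."""
    a = 6378137.0           # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563   # WGS-84 flattening
    b = (1 - f) * a         # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    # Iterate until the longitude on the auxiliary sphere converges
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos2sig_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sig_m + C * cos_sigma * (-1 + 2 * cos2sig_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dsig = B * sin_sigma * (cos2sig_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sig_m ** 2)
        - B / 6 * cos2sig_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sig_m ** 2)))
    return b * A * (sigma - dsig) / 1000.0  # metres -> km

# UME and XRY coordinates from the airport information section below
print(vincenty_km(63.791667, 20.282778, 36.744444, -6.06))  # ≈ 3491 km
```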

Haversine formula
  • 2167.010 miles
  • 3487.465 kilometers
  • 1883.081 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the sphere's surface.
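
A corresponding Python sketch of the haversine formula; with a mean Earth radius of 6371 km it reproduces the figure above to within rounding.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two points on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.asin(math.sqrt(a))

print(haversine_km(63.791667, 20.282778, 36.744444, -6.06))  # ≈ 3487 km
```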

How long does it take to fly from Umeå to Jerez de la Frontera?

The estimated flight time from Umeå Airport to Jerez Airport is 4 hours and 36 minutes.
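
The calculator does not say how this estimate is derived, but a quick check shows the average block speed it implies:

```python
distance_mi = 2169            # great-circle distance from above
block_time_h = 4 + 36 / 60    # 4 hours 36 minutes
print(distance_mi / block_time_h)  # ≈ 471.5 mph implied average speed
```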

What is the time difference between Umeå and Jerez de la Frontera?

There is no time difference between Umeå and Jerez de la Frontera.

Flight carbon footprint between Umeå Airport (UME) and Jerez Airport (XRY)

On average, flying from Umeå to Jerez de la Frontera generates about 237 kg (522 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
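
The site's emissions model is not published; the sketch below only illustrates the usual shape of such an estimate (fuel burned per passenger times a CO2 emission factor). The per-passenger fuel burn is an assumption chosen to match the 237 kg figure, not the site's parameter.

```python
EMISSION_FACTOR = 3.16    # kg CO2 per kg of jet fuel burned (standard factor)
fuel_per_pax_km = 0.0215  # kg fuel per passenger-km (assumed, not the site's value)
distance_km = 3491        # great-circle distance from above

co2_kg = distance_km * fuel_per_pax_km * EMISSION_FACTOR
print(f"{co2_kg:.0f} kg CO2, {co2_kg * 2.20462:.0f} lbs")  # ≈ 237 kg CO2
```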

Map of flight path and driving directions from Umeå to Jerez de la Frontera

See the map of the shortest flight path between Umeå Airport (UME) and Jerez Airport (XRY).

Airport information

Origin: Umeå Airport
City: Umeå
Country: Sweden
IATA Code: UME
ICAO Code: ESNU
Coordinates: 63°47′30″N, 20°16′58″E
Destination: Jerez Airport
City: Jerez de la Frontera
Country: Spain
IATA Code: XRY
ICAO Code: LEJR
Coordinates: 36°44′40″N, 6°3′36″W
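
If you want the coordinates above in decimal degrees (the form used in the code sketches earlier), a small conversion helper:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(63, 47, 30, "N"), dms_to_decimal(20, 16, 58, "E"))  # UME: 63.7917, 20.2828
print(dms_to_decimal(36, 44, 40, "N"), dms_to_decimal(6, 3, 36, "W"))    # XRY: 36.7444, -6.06
```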