
How far is Patras from Andenes?

The distance between Andenes (Andøya Airport, Andenes) and Patras (Patras Araxos Airport) is 2163 miles / 3480 kilometers / 1879 nautical miles.

The driving distance from Andenes (ANX) to Patras (GPA) is 3021 miles / 4862 kilometers, and travel time by car is about 60 hours 16 minutes.

Andøya Airport, Andenes – Patras Araxos Airport

  • 2163 miles
  • 3480 kilometers
  • 1879 nautical miles


Distance from Andenes to Patras

There are several ways to calculate the distance from Andenes to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 2162.607 miles
  • 3480.379 kilometers
  • 1879.254 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
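As a rough sketch of how such a distance can be computed, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, using the two airports' coordinates from the airport information section. This is an illustrative version of the well-known algorithm, not necessarily the exact implementation used for the figures above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Ellipsoidal distance (statute miles) via Vincenty's inverse formula."""
    a = 6378137.0            # WGS-84 semi-major axis, meters
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis

    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L

    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero for equatorial lines where cos2_alpha == 0
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344  # meters → statute miles

# ANX and GPA, decimal degrees (from the airport information below)
anx = (69.2925, 16.144167)    # 69°17′33″N, 16°8′39″E
gpa = (38.150833, 21.425556)  # 38°9′3″N, 21°25′32″E
print(round(vincenty_miles(*anx, *gpa), 3))  # ≈ 2162.6 miles
```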

Haversine formula
  • 2160.686 miles
  • 3477.286 kilometers
  • 1877.584 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
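The haversine calculation is much simpler. A minimal sketch, assuming a mean earth radius of 6371 km (the source does not state which radius it uses, so the result may differ from the figure above by a few kilometers):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# ANX → GPA, decimal degrees
print(round(haversine_km(69.2925, 16.144167, 38.150833, 21.425556), 1))
```

Comparing the two outputs shows why the page quotes both: the ellipsoidal (Vincenty) and spherical (haversine) models differ by roughly 3 km over this route.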

How long does it take to fly from Andenes to Patras?

The estimated flight time from Andøya Airport, Andenes to Patras Araxos Airport is 4 hours and 35 minutes.
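The page does not state how it derives this estimate. As a sanity check only, the quoted 4 h 35 min is consistent with an assumed effective block speed of about 472 mph over the 2163-mile great-circle distance (that speed is inferred here, not given by the source):

```python
# Hypothetical block speed chosen to reproduce the quoted time; an assumption,
# not the source's stated method.
distance_miles = 2163
assumed_block_speed_mph = 472

hours = distance_miles / assumed_block_speed_mph
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} h {m} min")  # → 4 h 35 min
```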

Flight carbon footprint between Andøya Airport, Andenes (ANX) and Patras Araxos Airport (GPA)

On average, flying from Andenes to Patras generates about 236 kg of CO2 per passenger, equivalent to about 520 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
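The quoted figure implies an emission factor of roughly 0.109 kg of CO2 per passenger-mile (236 kg ÷ 2163 mi). A small sketch of that back-of-envelope calculation, noting that the factor is inferred from the page's own numbers rather than stated by it:

```python
KG_PER_LB = 0.45359237  # exact kg-per-pound conversion

distance_miles = 2163
# Inferred from the page's 236 kg figure; an assumption, not a published factor.
inferred_factor_kg_per_mile = 236 / 2163

co2_kg = distance_miles * inferred_factor_kg_per_mile
co2_lbs = co2_kg / KG_PER_LB
print(round(co2_kg), round(co2_lbs))  # → 236 520
```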

Map of flight path and driving directions from Andenes to Patras


Airport information

Origin Andøya Airport, Andenes
City: Andenes
Country: Norway
IATA Code: ANX
ICAO Code: ENAN
Coordinates: 69°17′33″N, 16°8′39″E
Destination Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E
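The coordinates above are given in degrees/minutes/seconds; the distance formulas earlier need decimal degrees. A small hypothetical helper (the parser format is assumed from the listings above) for the conversion:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a DMS string like 69°17′33″N to signed decimal degrees."""
    deg, minute, sec, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value  # S and W are negative

print(round(dms_to_decimal("69°17′33″N"), 4))  # → 69.2925
print(round(dms_to_decimal("21°25′32″E"), 4))  # → 21.4256
```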