
How far is Patras from Arvidsjaur?

The distance between Arvidsjaur (Arvidsjaur Airport) and Patras (Patras Araxos Airport) is 1899 miles / 3056 kilometers / 1650 nautical miles.

The driving distance from Arvidsjaur (AJR) to Patras (GPA) is 2666 miles / 4291 kilometers, and travel time by car is about 49 hours 50 minutes.

Arvidsjaur Airport – Patras Araxos Airport

  • 1899 miles
  • 3056 kilometers
  • 1650 nautical miles


Distance from Arvidsjaur to Patras

There are several ways to calculate the distance from Arvidsjaur to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 1898.939 miles
  • 3056.047 kilometers
  • 1650.133 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
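
For anyone reproducing the ellipsoidal figure, here is a minimal Python sketch using pyproj's Geod class (our choice of library; the calculator's own implementation isn't published). Modern PROJ solves the inverse geodesic problem with Karney's algorithm, which agrees with Vincenty's formula to well under a metre at this range:

    from pyproj import Geod

    geod = Geod(ellps="WGS84")  # WGS-84 reference ellipsoid

    # inv() takes lon/lat pairs and returns forward azimuth,
    # back azimuth and distance in metres
    _, _, dist_m = geod.inv(19.2817, 65.5903, 21.4256, 38.1508)  # AJR -> GPA
    print(round(dist_m / 1000, 1))  # ≈ 3056 km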

Haversine formula
  • 1897.821 miles
  • 3054.247 kilometers
  • 1649.162 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the surface of the sphere).
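
A minimal haversine sketch in plain Python, assuming the conventional mean Earth radius of 6,371 km and the airport coordinates converted to decimal degrees:

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a spherical Earth, in kilometres."""
        phi1, phi2 = radians(lat1), radians(lat2)
        half_dlat = radians(lat2 - lat1) / 2
        half_dlon = radians(lon2 - lon1) / 2
        a = sin(half_dlat) ** 2 + cos(phi1) * cos(phi2) * sin(half_dlon) ** 2
        return 2 * radius_km * asin(sqrt(a))

    # AJR (65°35′25″N, 19°16′54″E) to GPA (38°9′3″N, 21°25′32″E)
    print(round(haversine_km(65.5903, 19.2817, 38.1508, 21.4256), 1))  # ≈ 3054 km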

How long does it take to fly from Arvidsjaur to Patras?

The estimated flight time from Arvidsjaur Airport to Patras Araxos Airport is 4 hours and 5 minutes.
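
The calculator's exact assumptions aren't published. A back-of-envelope sketch that reproduces the figure, assuming roughly 30 minutes for taxi, climb and descent plus an average cruise speed of about 530 mph (both values are our assumptions, not the site's):

    distance_mi = 1899
    cruise_mph = 530    # assumed average block speed, not from the source
    overhead_min = 30   # assumed allowance for taxi, climb and descent

    total_min = overhead_min + distance_mi / cruise_mph * 60
    print(f"{total_min // 60:.0f} h {total_min % 60:.0f} min")  # 4 h 5 min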

Flight carbon footprint between Arvidsjaur Airport (AJR) and Patras Araxos Airport (GPA)

On average, flying from Arvidsjaur to Patras generates about 208 kg of CO2 per passenger, which is about 459 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
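
As a rough cross-check (the site's methodology isn't published), the stated figures imply a factor of about 68 g of CO2 per passenger-kilometre over this route:

    co2_kg = 208                       # site's per-passenger estimate
    distance_km = 3056
    co2_lbs = co2_kg * 2.20462         # kilograms to pounds
    g_per_km = co2_kg * 1000 / distance_km
    print(f"{co2_lbs:.0f} lbs, {g_per_km:.0f} g CO2 per passenger-km")  # 459 lbs, 68 g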

Map of flight path and driving directions from Arvidsjaur to Patras

See the map of the shortest flight path between Arvidsjaur Airport (AJR) and Patras Araxos Airport (GPA).

Airport information

Origin Arvidsjaur Airport
City: Arvidsjaur
Country: Sweden
IATA Code: AJR
ICAO Code: ESNX
Coordinates: 65°35′25″N, 19°16′54″E
Destination Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E
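
The coordinates above are given in degrees/minutes/seconds; a small helper converts them to the decimal degrees used in the distance sketches earlier:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(65, 35, 25, "N"), dms_to_decimal(19, 16, 54, "E"))  # AJR ≈ 65.5903, 19.2817
    print(dms_to_decimal(38, 9, 3, "N"), dms_to_decimal(21, 25, 32, "E"))    # GPA ≈ 38.1508, 21.4256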