How far is Patras from Fagernes?

The distance between Fagernes (Fagernes Airport, Leirin) and Patras (Patras Araxos Airport) is 1665 miles / 2680 kilometers / 1447 nautical miles.

The driving distance from Fagernes (VDB) to Patras (GPA) is 2233 miles / 3593 kilometers, and travel time by car is about 40 hours 57 minutes.

Fagernes Airport, Leirin – Patras Araxos Airport

1665 miles / 2680 kilometers / 1447 nautical miles

Distance from Fagernes to Patras

There are several ways to calculate the distance from Fagernes to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 1665.252 miles
  • 2679.963 kilometers
  • 1447.064 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
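The site does not publish its implementation, so the following is only a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed below converted to decimal degrees; the function name, iteration cap, and tolerance are my own choices, not taken from this calculator.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    # WGS-84 ellipsoid parameters
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)       # distance in metres

# VDB (61°0′56″N, 9°17′17″E) to GPA (38°9′3″N, 21°25′32″E)
metres = vincenty_inverse(61.015556, 9.288056, 38.150833, 21.425556)
print(f"{metres / 1609.344:.3f} mi, {metres / 1000:.3f} km, {metres / 1852:.3f} NM")
```

With these inputs the result should land very close to the 2679.963 kilometers quoted above; small deviations can come from rounding the coordinates to whole seconds of arc.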

Haversine formula
  • 1664.435 miles
  • 2678.649 kilometers
  • 1446.355 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
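For comparison, here is a short haversine sketch. The mean Earth radius of 6371 km is a common convention and an assumption on my part, since the site does not state which radius its calculator uses.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# VDB to GPA in decimal degrees
km = haversine(61.015556, 9.288056, 38.150833, 21.425556)
print(f"{km:.3f} km  ({km / 1.609344:.3f} mi, {km / 1.852:.3f} NM)")
```

This should come out within a kilometer or so of the 2678.649 kilometers listed above, with the exact value depending on the assumed radius and coordinate rounding.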

How long does it take to fly from Fagernes to Patras?

The estimated flight time from Fagernes Airport, Leirin to Patras Araxos Airport is 3 hours and 39 minutes.
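The page does not say how this estimate is derived. A common back-of-the-envelope approach is distance divided by an assumed average block speed, plus a fixed allowance for taxi, climb, and descent; the 500 mph speed and 30-minute allowance below are illustrative assumptions only and will not reproduce the quoted figure exactly.

```python
def estimate_flight_time(distance_miles, block_speed_mph=500, overhead_min=30):
    """Crude flight-time estimate: cruise leg at an assumed average speed
    plus a fixed overhead for taxi, climb and descent (both assumptions)."""
    total_min = distance_miles / block_speed_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1665.252))
```

With these assumed parameters the sketch gives roughly 3 hours 50 minutes, somewhat longer than the 3 hours 39 minutes quoted above.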

Flight carbon footprint between Fagernes Airport, Leirin (VDB) and Patras Araxos Airport (GPA)

On average, flying from Fagernes to Patras generates about 190 kg of CO2 per passenger; 190 kilograms is equal to 419 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
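One way such a per-passenger figure can be approximated is assumed per-passenger fuel burn multiplied by roughly 3.16 kg of CO2 released per kg of jet fuel burned. The fuel-burn rate in the sketch below is an illustrative assumption, not the value this site uses.

```python
def co2_per_passenger_kg(distance_km, fuel_kg_per_pax_km=0.024, co2_per_kg_fuel=3.16):
    """Very rough CO2 estimate: assumed per-passenger fuel burn (kg per km)
    times the ~3.16 kg of CO2 produced per kg of jet fuel burned."""
    return distance_km * fuel_kg_per_pax_km * co2_per_kg_fuel

print(f"{co2_per_passenger_kg(2679.963):.0f} kg CO2 per passenger")
```

With these assumed numbers the sketch gives about 203 kg, in the same ballpark as the 190 kg quoted above; the difference comes entirely from the assumed fuel-burn rate.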

Map of flight path and driving directions from Fagernes to Patras

See the map of the shortest flight path between Fagernes Airport, Leirin (VDB) and Patras Araxos Airport (GPA).

Airport information

Origin: Fagernes Airport, Leirin
City: Fagernes
Country: Norway
IATA Code: VDB
ICAO Code: ENFG
Coordinates: 61°0′56″N, 9°17′17″E
Destination: Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E
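The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion sketch (the helper name is mine, not part of the site):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# VDB: 61°0′56″N, 9°17′17″E  ->  (61.0156, 9.2881)
# GPA: 38°9′3″N, 21°25′32″E  ->  (38.1508, 21.4256)
print(dms_to_decimal(61, 0, 56, "N"), dms_to_decimal(9, 17, 17, "E"))
print(dms_to_decimal(38, 9, 3, "N"), dms_to_decimal(21, 25, 32, "E"))
```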