
How far is Patras from Hammerfest?

The distance between Hammerfest (Hammerfest Airport) and Patras (Patras Araxos Airport) is 2251 miles / 3623 kilometers / 1956 nautical miles.

The driving distance from Hammerfest (HFT) to Patras (GPA) is 3136 miles / 5047 kilometers, and travel time by car is about 60 hours 40 minutes.

Hammerfest Airport – Patras Araxos Airport

2251 miles / 3623 kilometers / 1956 nautical miles


Distance from Hammerfest to Patras

There are several ways to calculate the distance from Hammerfest to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 2251.176 miles
  • 3622.917 kilometers
  • 1956.219 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
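For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, iteration limit, and convergence tolerance are illustrative choices, not the calculator's actual code.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0          # semi-major axis (metres)
    f = 1 / 298.257223563  # flattening
    b = (1 - f) * a        # semi-minor axis (metres)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma) / 1609.344  # metres -> statute miles

# Airport coordinates from the table below, converted to decimal degrees
hft = (70 + 40 / 60 + 46 / 3600, 23 + 40 / 60 + 6 / 3600)   # HFT
gpa = (38 + 9 / 60 + 3 / 3600, 21 + 25 / 60 + 32 / 3600)    # GPA
print(round(vincenty_miles(*hft, *gpa), 3))  # ≈ 2251 miles
```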

Haversine formula
  • 2248.980 miles
  • 3619.382 kilometers
  • 1954.310 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the sphere's surface.
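By comparison, the spherical calculation fits in a few lines. A minimal sketch, assuming the commonly used 6371 km mean Earth radius; the site's exact choice of radius isn't stated, so the last decimals may differ:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius, in statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h)) / 1.609344

print(round(haversine_miles(70.67944, 23.66833, 38.15083, 21.42556), 3))  # ≈ 2249 miles
```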

How long does it take to fly from Hammerfest to Patras?

The estimated flight time from Hammerfest Airport to Patras Araxos Airport is 4 hours and 45 minutes.
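Calculators like this one typically estimate flight time with a rule of thumb: a fixed allowance for taxi, climb and descent, plus the distance flown at a typical jet cruise speed. The parameters below (500 mph cruise, 30 minutes overhead) are assumptions for illustration, not the site's published values, so the sketch lands near but not exactly on the 4 hours 45 minutes quoted above.

```python
def flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rule-of-thumb block time: fixed overhead plus cruise at an assumed speed."""
    hours = overhead_min / 60.0 + distance_miles / cruise_mph
    return int(hours), round((hours - int(hours)) * 60)

h, m = flight_time(2251.176)
print(f"about {h} h {m} min")  # about 5 h 0 min with these assumed parameters
```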

Flight carbon footprint between Hammerfest Airport (HFT) and Patras Araxos Airport (GPA)

On average, flying from Hammerfest to Patras generates about 246 kg of CO2 per passenger, which is equal to 542 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
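A common way to produce such a figure is to multiply the flight distance by a per-kilometre emission factor. The factor below is simply back-solved from the 246 kg over 3623 km quoted here (about 0.068 kg of CO2 per passenger-kilometre) and is an assumption; real calculators vary the factor with aircraft type and flight length.

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

def co2_per_passenger_kg(distance_km, kg_per_km=0.0679):
    # kg_per_km is back-solved from this page's figures, not a published factor
    return distance_km * kg_per_km

kg = co2_per_passenger_kg(3622.917)
print(round(kg), "kg =", round(kg / KG_PER_LB), "lb")  # ≈ 246 kg = 542 lb
```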

Map of flight path and driving directions from Hammerfest to Patras

See the map of the shortest flight path between Hammerfest Airport (HFT) and Patras Araxos Airport (GPA).

Airport information

Origin: Hammerfest Airport
City: Hammerfest
Country: Norway
IATA Code: HFT
ICAO Code: ENHF
Coordinates: 70°40′46″N, 23°40′6″E
Destination: Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E
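The coordinates above are given in degrees, minutes and seconds, while the distance formulas expect signed decimal degrees. A small conversion helper (the function name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (deg + minutes / 60.0 + seconds / 3600.0)

# HFT: 70°40′46″N, 23°40′6″E        GPA: 38°9′3″N, 21°25′32″E
print(dms_to_decimal(70, 40, 46, "N"), dms_to_decimal(23, 40, 6, "E"))   # ≈ 70.6794, 23.6683
print(dms_to_decimal(38, 9, 3, "N"), dms_to_decimal(21, 25, 32, "E"))    # ≈ 38.1508, 21.4256
```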