
How far is Patras from Stavanger?

The distance between Stavanger (Stavanger Airport, Sola) and Patras (Patras Araxos Airport) is 1596 miles / 2568 kilometers / 1387 nautical miles.

The driving distance from Stavanger (SVG) to Patras (GPA) is 2177 miles / 3504 kilometers, and travel time by car is about 38 hours 46 minutes.

Stavanger Airport, Sola – Patras Araxos Airport

1596 miles / 2568 kilometers / 1387 nautical miles

Distance from Stavanger to Patras

There are several ways to calculate the distance from Stavanger to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 1595.696 miles
  • 2568.024 kilometers
  • 1386.622 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
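As a sketch of how such an ellipsoidal calculation works, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, using the SVG and GPA coordinates from the airport table below. The iteration limit and convergence tolerance are illustrative choices, not necessarily those used for the figure above.

```python
import math

A_AXIS = 6378137.0            # WGS-84 semi-major axis (meters)
F = 1 / 298.257223563         # WGS-84 flattening
B_AXIS = (1 - F) * A_AXIS     # semi-minor axis (meters)

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in statute miles between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(100):                       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    meters = B_AXIS * A * (sigma - d_sigma)
    return meters / 1609.344                   # meters per statute mile

# SVG: 58°52'36"N, 5°38'16"E   GPA: 38°9'3"N, 21°25'32"E
svg = (58 + 52/60 + 36/3600, 5 + 38/60 + 16/3600)
gpa = (38 + 9/60 + 3/3600, 21 + 25/60 + 32/3600)
print(round(vincenty_miles(*svg, *gpa), 3))    # close to the 1595.696 mi above
```

The iteration refines the azimuthal angle until it stabilizes; the closed-form correction terms then account for the Earth's flattening.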

Haversine formula
  • 1594.737 miles
  • 2566.481 kilometers
  • 1385.789 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
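The haversine computation is compact enough to sketch in full. This version assumes a 6371 km mean Earth radius (a common convention; results shift slightly with other radii) and plugs in the SVG and GPA coordinates from the airport table below:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers on a spherical Earth (R = 6371 km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

# SVG: 58°52'36"N, 5°38'16"E   GPA: 38°9'3"N, 21°25'32"E
d = haversine_km(58 + 52/60 + 36/3600, 5 + 38/60 + 16/3600,
                 38 + 9/60 + 3/3600, 21 + 25/60 + 32/3600)
print(round(d, 1))   # within a few kilometers of the 2566 km figure above
```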

How long does it take to fly from Stavanger to Patras?

The estimated flight time from Stavanger Airport, Sola to Patras Araxos Airport is 3 hours and 31 minutes.
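A common rule of thumb for such estimates is cruise distance at roughly 500 mph plus about 30 minutes for taxi, climb, and descent. Both numbers here are assumptions rather than the site's exact formula, so the sketch below only approximates the figure above:

```python
def estimate_flight_time(miles, cruise_mph=500, overhead_min=30):
    """Rough flight time: cruise at cruise_mph plus fixed overhead minutes."""
    total_min = round(miles / cruise_mph * 60 + overhead_min)
    return divmod(total_min, 60)   # (hours, minutes)

hours, minutes = estimate_flight_time(1595.696)
print(f"{hours} h {minutes} min")   # 3 h 41 min with these assumptions
```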

Flight carbon footprint between Stavanger Airport, Sola (SVG) and Patras Araxos Airport (GPA)

On average, flying from Stavanger to Patras generates about 186 kg of CO2 per passenger, which is roughly 410 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
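A quick sanity check of the unit conversion and the implied per-mile figure (the 2.20462 lb/kg factor is the standard conversion; the per-mile rate is derived from the numbers above, not stated by the source):

```python
CO2_KG = 186                       # per-passenger estimate from the text
lbs = CO2_KG * 2.20462             # kilograms to pounds
print(round(lbs))                  # 410

per_mile_kg = CO2_KG / 1595.696    # implied emission rate for this route
print(round(per_mile_kg, 3))       # about 0.117 kg of CO2 per passenger-mile
```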

Map of flight path and driving directions from Stavanger to Patras

See the map of the shortest flight path between Stavanger Airport, Sola (SVG) and Patras Araxos Airport (GPA).

Airport information

Origin: Stavanger Airport, Sola
City: Stavanger
Country: Norway
IATA Code: SVG
ICAO Code: ENZV
Coordinates: 58°52′36″N, 5°38′16″E
Destination: Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E