
How far is Patras from Waterford?

The distance between Waterford (Waterford Airport) and Patras (Patras Araxos Airport) is 1679 miles / 2702 kilometers / 1459 nautical miles.

The driving distance from Waterford (WAT) to Patras (GPA) is 2324 miles / 3740 kilometers, and travel time by car is about 42 hours 59 minutes.

Waterford Airport – Patras Araxos Airport

  • 1679 miles
  • 2702 kilometers
  • 1459 nautical miles


Distance from Waterford to Patras

There are several ways to calculate the distance from Waterford to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 1679.234 miles
  • 2702.465 kilometers
  • 1459.214 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
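A minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the WAT and GPA coordinates listed in the airport information below. The function name and iteration limits are illustrative choices, not the calculator's actual implementation, and edge cases (coincident or near-antipodal points) are not handled.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0                  # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = a * (1 - f)                # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# WAT and GPA coordinates from the airport information section
km = vincenty_inverse(52.186944, -7.086944, 38.150833, 21.425556) / 1000
print(f"{km:.1f} km")  # ≈ 2702 km, in line with the figure above
```

The iteration converges in a handful of steps for points this far from antipodal.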

Haversine formula
  • 1676.421 miles
  • 2697.938 kilometers
  • 1456.770 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
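The haversine formula is compact enough to sketch in full. This assumes a mean Earth radius of 6371 km; the airport coordinates are taken from the airport information section below.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# WAT (52°11′13″N, 7°5′13″W) to GPA (38°9′3″N, 21°25′32″E)
km = haversine(52.186944, -7.086944, 38.150833, 21.425556)
print(f"{km:.1f} km")  # ≈ 2698 km, matching the haversine figure above
```

The spherical result comes out about 4.5 km shorter than the ellipsoidal Vincenty value, which is typical for routes of this length.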

How long does it take to fly from Waterford to Patras?

The estimated flight time from Waterford Airport to Patras Araxos Airport is 3 hours and 40 minutes.

Flight carbon footprint between Waterford Airport (WAT) and Patras Araxos Airport (GPA)

On average, flying from Waterford to Patras generates about 191 kg of CO2 per passenger, which is roughly 421 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
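The kilograms-to-pounds conversion can be checked directly using the exact definition of the pound (0.45359237 kg):

```python
KG_PER_LB = 0.45359237  # exact definition of the international pound

def kg_to_lb(kg):
    """Convert a mass in kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(191)))  # 421
```

191 kg works out to about 421 lbs when rounded to the nearest pound.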

Map of flight path and driving directions from Waterford to Patras

See the map of the shortest flight path between Waterford Airport (WAT) and Patras Araxos Airport (GPA).

Airport information

Origin Waterford Airport
City: Waterford
Country: Ireland
IATA Code: WAT
ICAO Code: EIWF
Coordinates: 52°11′13″N, 7°5′13″W
Destination Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E