
How far is Patras from Tenerife?

The distance between Tenerife (Tenerife South Airport) and Patras (Patras Araxos Airport) is 2293 miles / 3691 kilometers / 1993 nautical miles.

The driving distance from Tenerife (TFS) to Patras (GPA) is 3131 miles / 5039 kilometers, and travel time by car is about 80 hours 17 minutes.

Tenerife South Airport – Patras Araxos Airport

2293 miles / 3691 kilometers / 1993 nautical miles


Distance from Tenerife to Patras

There are several ways to calculate the distance from Tenerife to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 2293.394 miles
  • 3690.859 kilometers
  • 1992.905 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
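As a rough sketch of how an ellipsoidal distance like the one above can be computed in practice, the snippet below uses the geopy library. Its geodesic function solves the same ellipsoidal inverse problem (via Karney's method in geographiclib rather than Vincenty's iteration), so on the WGS-84 ellipsoid it gives essentially the same figures. The decimal coordinates come from the airport information section further down.

```python
# pip install geopy
from geopy.distance import geodesic

# Decimal-degree coordinates of the two airports (see airport information below)
TFS = (28.044444, -16.572500)   # Tenerife South Airport, 28°2'40"N 16°34'21"W
GPA = (38.150833, 21.425556)    # Patras Araxos Airport, 38°9'3"N 21°25'32"E

# Ellipsoidal (WGS-84) distance, comparable to the Vincenty figures above
d = geodesic(TFS, GPA)
print(f"{d.miles:.3f} miles")              # ~2293 miles
print(f"{d.km:.3f} kilometers")            # ~3691 kilometers
print(f"{d.nautical:.3f} nautical miles")  # ~1993 nautical miles
```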

Haversine formula
  • 2289.547 miles
  • 3684.668 kilometers
  • 1989.562 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
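For comparison, here is a minimal haversine sketch in Python, assuming a mean Earth radius of 6371 km (the exact radius a given calculator uses may differ slightly, which shifts the result by a few kilometers):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# Tenerife South (TFS) to Patras Araxos (GPA)
km = haversine_km(28.044444, -16.572500, 38.150833, 21.425556)
print(f"{km:.1f} km, {km / 1.609344:.1f} mi, {km / 1.852:.1f} NM")  # ~3685 km
```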

How long does it take to fly from Tenerife to Patras?

The estimated flight time from Tenerife South Airport to Patras Araxos Airport is 4 hours and 50 minutes.
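Such estimates are typically built from a simple block-time model: distance divided by an assumed average cruise speed, plus a fixed allowance for taxi, climb and descent. The constants below are assumptions chosen to illustrate the idea, not the calculator's actual parameters, but they land close to the 4 hours 50 minutes quoted above.

```python
# Rough block-time sketch; the 530 mph cruise speed and 30-minute
# taxi/climb/descent allowance are illustrative assumptions.
def flight_time_hours(distance_miles, cruise_mph=530, overhead_hours=0.5):
    return distance_miles / cruise_mph + overhead_hours

hours = flight_time_hours(2293.394)
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"~{h} h {m} min")  # roughly 4 h 50 min
```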

Flight carbon footprint between Tenerife South Airport (TFS) and Patras Araxos Airport (GPA)

On average, flying from Tenerife to Patras generates about 251 kg of CO2 per passenger, which is roughly 554 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
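As a small worked sketch of the numbers: the pound conversion is straightforward, and the CO2 figure can be related back to fuel burn using the commonly cited factor of about 3.16 kg of CO2 per kg of jet fuel burned. That factor is an assumption here, not something stated by this calculator.

```python
KG_PER_LB = 0.45359237           # exact definition of the pound
CO2_PER_KG_JET_FUEL = 3.16       # widely used combustion factor (assumption)

co2_kg = 251                             # per-passenger estimate from above
print(f"{co2_kg / KG_PER_LB:.0f} lbs")   # ~553 lbs (554 above reflects an unrounded kg value)
print(f"~{co2_kg / CO2_PER_KG_JET_FUEL:.0f} kg of jet fuel per passenger")
```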

Map of flight path and driving directions from Tenerife to Patras

See the map of the shortest flight path between Tenerife South Airport (TFS) and Patras Araxos Airport (GPA).

Airport information

Origin: Tenerife South Airport
City: Tenerife
Country: Spain
IATA Code: TFS
ICAO Code: GCTS
Coordinates: 28°2′40″N, 16°34′21″W
Destination: Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E
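The decimal-degree values used in the distance examples above come from these DMS coordinates. A small helper for the conversion (south and west hemispheres take a negative sign):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Tenerife South Airport (TFS): 28°2'40"N, 16°34'21"W
print(dms_to_decimal(28, 2, 40, "N"), dms_to_decimal(16, 34, 21, "W"))
# -> 28.0444..., -16.5725

# Patras Araxos Airport (GPA): 38°9'3"N, 21°25'32"E
print(dms_to_decimal(38, 9, 3, "N"), dms_to_decimal(21, 25, 32, "E"))
# -> 38.1508..., 21.4255...
```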