How far is Patras from Palanga?
The distance between Palanga (Palanga International Airport) and Patras (Patras Araxos Airport) is 1231 miles / 1981 kilometers / 1070 nautical miles.
The driving distance from Palanga (PLQ) to Patras (GPA) is 1858 miles / 2990 kilometers, and travel time by car is about 31 hours 33 minutes.
Palanga International Airport – Patras Araxos Airport
Distance from Palanga to Patras
There are several ways to calculate the distance from Palanga to Patras. Here are two standard methods:
Vincenty's formula (applied above)
- 1231.225 miles
- 1981.464 kilometers
- 1069.905 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 1231.485 miles
- 1981.884 kilometers
- 1070.131 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the earth's surface).
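As a minimal sketch of the haversine method, assuming the coordinates listed in the airport tables below and a mean earth radius of 6,371 km (the function name and constants are illustrative, not from the source):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points on a spherical earth."""
    R_KM = 6371.0  # assumed mean earth radius in kilometers
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    km = 2 * R_KM * math.asin(math.sqrt(a))
    return km / 1.609344  # kilometers to statute miles

# PLQ (55°58′23″N, 21°5′38″E) and GPA (38°9′3″N, 21°25′32″E) in decimal degrees
print(haversine_miles(55.9731, 21.0939, 38.1508, 21.4256))  # ≈ 1231.5 miles
```

The ellipsoidal figure can be approximated with a library such as geopy (`geopy.distance.geodesic`), though current geopy releases use Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's formula, so the last decimal places may differ.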
How long does it take to fly from Palanga to Patras?
The estimated flight time from Palanga International Airport to Patras Araxos Airport is 2 hours and 49 minutes.
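The source does not state the formula behind this figure. A common rule of thumb adds a fixed allowance of about 30 minutes for taxi, climb, and descent to cruise time at roughly 500 mph; the sketch below uses those assumed constants and lands in the same range as the quoted time:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Rule-of-thumb estimate: fixed taxi/climb/descent allowance plus cruise time.
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(1231.225)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ≈ 2 h 58 min vs. the quoted 2 h 49 min
```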
What is the time difference between Palanga and Patras?
There is no time difference between Palanga and Patras: Lithuania and Greece both observe Eastern European Time (UTC+2) and Eastern European Summer Time (UTC+3).
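This can be verified with Python's standard zoneinfo module, using the nearest IANA time zones to the two cities (Europe/Vilnius for Palanga, Europe/Athens for Patras):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

now = datetime.now(tz=ZoneInfo("UTC"))
offset_palanga = now.astimezone(ZoneInfo("Europe/Vilnius")).utcoffset()
offset_patras = now.astimezone(ZoneInfo("Europe/Athens")).utcoffset()
print(offset_patras - offset_palanga)  # 0:00:00 — both cities are on EET/EEST
```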
Flight carbon footprint between Palanga International Airport (PLQ) and Patras Araxos Airport (GPA)
On average, flying from Palanga to Patras generates about 163 kg of CO2 per passenger (roughly 359 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
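A back-of-the-envelope version of this estimate scales linearly with flight distance. The emission factor below (about 82 g of CO2 per passenger-kilometer) is back-calculated from the figures above and is an assumption, not a published constant:

```python
KG_CO2_PER_PAX_KM = 0.082  # assumed factor, derived from 163 kg / 1981.884 km
LB_PER_KG = 2.20462        # kilograms to pounds

co2_kg = 1981.884 * KG_CO2_PER_PAX_KM
print(f"{co2_kg:.0f} kg CO2 per passenger ≈ {co2_kg * LB_PER_KG:.0f} lb")  # ≈ 163 kg ≈ 358 lb
```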
Map of flight path and driving directions from Palanga to Patras
See the map of the shortest flight path between Palanga International Airport (PLQ) and Patras Araxos Airport (GPA).
Airport information
| Origin | Palanga International Airport |
|---|---|
| City: | Palanga |
| Country: | Lithuania |
| IATA Code: | PLQ |
| ICAO Code: | EYPA |
| Coordinates: | 55°58′23″N, 21°5′38″E |
| Destination | Patras Araxos Airport |
|---|---|
| City: | Patras |
| Country: | Greece |
| IATA Code: | GPA |
| ICAO Code: | LGRX |
| Coordinates: | 38°9′3″N, 21°25′32″E |