How far is Patras from Marina Di Campo?
The distance between Marina Di Campo (Marina di Campo Airport) and Patras (Patras Araxos Airport) is 669 miles / 1077 kilometers / 581 nautical miles.
The driving distance from Marina Di Campo (EBA) to Patras (GPA) is 925 miles / 1488 kilometers, and travel time by car is about 24 hours 3 minutes.
Marina di Campo Airport – Patras Araxos Airport
Distance from Marina Di Campo to Patras
There are several ways to calculate the distance from Marina Di Campo to Patras. Here are two standard methods:
Vincenty's formula (applied above)
- 669.101 miles
- 1076.814 kilometers
- 581.433 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
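For readers who want to reproduce the figure, here is a self-contained Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and structure are my own, not the calculator's published code, and the sketch does not handle coincident or near-antipodal points:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0          # semi-major axis (m)
    f = 1 / 298.257223563  # flattening
    b = (1 - f) * a        # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L

    # Iterate on lambda until it converges (diverges for near-antipodal points).
    for _ in range(200):
        sin_sigma = math.sqrt(
            (math.cos(U2) * math.sin(lam)) ** 2
            + (math.cos(U1) * math.sin(U2)
               - math.sin(U1) * math.cos(U2) * math.cos(lam)) ** 2)
        cos_sigma = (math.sin(U1) * math.sin(U2)
                     + math.cos(U1) * math.cos(U2) * math.cos(lam))
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = math.cos(U1) * math.cos(U2) * math.sin(lam) / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * math.sin(U1) * math.sin(U2) / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    # Ellipsoidal correction terms.
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # distance in metres

# EBA (42°45′37″N, 10°14′21″E) -> GPA (38°9′3″N, 21°25′32″E)
d = vincenty_distance(42.7603, 10.2392, 38.1508, 21.4256)
print(f"{d / 1000:.3f} km")  # about 1076.8 km, matching the figure above
```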
Haversine formula
- 668.001 miles
- 1075.043 kilometers
- 580.477 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
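As a cross-check, here is a minimal Python sketch of the haversine formula. The function name and the 6371.0088 km mean Earth radius are my choices; the exact result depends slightly on which radius is used:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2):
    R = 6371.0088  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    # Great-circle distance on a spherical Earth.
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

print(f"{haversine_distance(42.7603, 10.2392, 38.1508, 21.4256):.1f} km")
# about 1075 km, matching the haversine figure above
```

Because the haversine formula assumes a sphere while Vincenty's formula uses an ellipsoid, the two results differ by roughly a mile on this route.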
How long does it take to fly from Marina Di Campo to Patras?
The estimated flight time from Marina di Campo Airport to Patras Araxos Airport is 1 hour and 46 minutes.
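As a rough sanity check, many flight-time estimates combine an assumed average block speed with a fixed allowance for taxi, climb, and descent. The parameters below (500 mph and 30 minutes) are illustrative assumptions, not the calculator's published model, so the result differs slightly from the quoted 1 hour 46 minutes:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # Rule-of-thumb estimate: cruise time plus a fixed
    # taxi/climb/descent allowance. Both defaults are assumptions.
    minutes = distance_miles / cruise_mph * 60 + overhead_min
    return divmod(round(minutes), 60)

hours, mins = estimated_flight_time(669)
print(f"{hours} h {mins} min")  # ~1 h 50 min with these assumptions
```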
What is the time difference between Marina Di Campo and Patras?
The time difference between Marina Di Campo and Patras is 1 hour: Patras is 1 hour ahead. Italy observes Central European Time (UTC+1) and Greece observes Eastern European Time (UTC+2), and both switch to summer time on the same dates.
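The offset can be confirmed with Python's standard zoneinfo module (Python 3.9+), comparing the IANA zones that cover Elba and Patras:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

now = datetime.now(ZoneInfo("UTC"))
rome = now.astimezone(ZoneInfo("Europe/Rome"))      # covers Marina Di Campo (Elba)
athens = now.astimezone(ZoneInfo("Europe/Athens"))  # covers Patras

# Both zones change to summer time on the same EU dates,
# so the difference is one hour year-round.
print(athens.utcoffset() - rome.utcoffset())  # 1:00:00
```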
Flight carbon footprint between Marina di Campo Airport (EBA) and Patras Araxos Airport (GPA)
On average, flying from Marina Di Campo to Patras generates about 121 kg of CO2 per passenger, which is roughly 267 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
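For a sense of where such a number can come from, here is a minimal sketch. The per-passenger fuel-burn rate of 3.5 kg per 100 km is a hypothetical figure for a regional flight, not the site's model; the factor of about 3.16 kg of CO2 per kg of jet fuel burned is the commonly cited combustion value:

```python
def co2_per_passenger_kg(distance_km, fuel_per_pax_per_100km=3.5,
                         co2_per_kg_fuel=3.16):
    # fuel_per_pax_per_100km is an assumed illustrative figure;
    # 3.16 kg CO2 per kg of jet fuel is the standard combustion factor.
    fuel_kg = distance_km / 100 * fuel_per_pax_per_100km
    return fuel_kg * co2_per_kg_fuel

print(round(co2_per_passenger_kg(1077)))  # ~119 kg with these assumptions
```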
Map of flight path and driving directions from Marina Di Campo to Patras
See the map of the shortest flight path between Marina di Campo Airport (EBA) and Patras Araxos Airport (GPA).
Airport information
| Origin | Marina di Campo Airport |
| --- | --- |
| City: | Marina Di Campo |
| Country: | Italy |
| IATA Code: | EBA |
| ICAO Code: | LIRJ |
| Coordinates: | 42°45′37″N, 10°14′21″E |
| Destination | Patras Araxos Airport |
| --- | --- |
| City: | Patras |
| Country: | Greece |
| IATA Code: | GPA |
| ICAO Code: | LGRX |
| Coordinates: | 38°9′3″N, 21°25′32″E |
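The coordinates above are in degrees-minutes-seconds, while the distance formulas earlier expect decimal degrees. A small conversion helper (the function name is mine):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Southern and western hemispheres take a negative sign.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# EBA: 42°45′37″N, 10°14′21″E
lat = dms_to_decimal(42, 45, 37, "N")
lon = dms_to_decimal(10, 14, 21, "E")
print(f"{lat:.4f}, {lon:.4f}")  # 42.7603, 10.2392
```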