
How far is Patras from Corpus Christi, TX?

The distance between Corpus Christi (Corpus Christi International Airport) and Patras (Patras Araxos Airport) is 6424 miles / 10338 kilometers / 5582 nautical miles.

Corpus Christi International Airport – Patras Araxos Airport
  • 6424 miles
  • 10338 kilometers
  • 5582 nautical miles


Distance from Corpus Christi to Patras

There are several ways to calculate the distance from Corpus Christi to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 6423.895 miles
  • 10338.257 kilometers
  • 5582.212 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
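A direct implementation of Vincenty's inverse formula is longer than the haversine version but uses only the standard library. The sketch below is one such implementation, not the calculator's own code: it assumes the WGS-84 ellipsoid, caps the iteration at 200 steps, and feeds in the airport coordinates (listed below) converted to decimal degrees, so it should land very close to the figure above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Ellipsoidal (WGS-84) distance in statute miles via Vincenty's inverse formula."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1, U2 = math.atan((1 - f) * math.tan(phi1)), math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                       if cos_sq_alpha != 0 else 0.0)  # points on the equator
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos2sigma_m ** 2)
            - B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344  # metres to statute miles

# CRP: 27°46′13″N, 97°30′4″W   GPA: 38°9′3″N, 21°25′32″E
print(round(vincenty_miles(27.7703, -97.5012, 38.1508, 21.4256), 1))  # ≈ 6424 miles
```

Note that Vincenty's iteration can fail to converge for nearly antipodal points; that is not an issue for this route.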

Haversine formula
  • 6411.416 miles
  • 10318.174 kilometers
  • 5571.368 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
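A minimal haversine sketch in Python (standard library only) follows. It assumes the conventional mean Earth radius of 6371 km, which reproduces the figures above to within rounding; the function name and the decimal coordinates are illustrative.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in statute miles, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    central_angle = 2 * math.asin(math.sqrt(h))
    km = radius_km * central_angle
    return km / 1.609344  # kilometres to statute miles

print(round(haversine_miles(27.7703, -97.5012, 38.1508, 21.4256), 1))  # ≈ 6411 miles
```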

How long does it take to fly from Corpus Christi to Patras?

The estimated flight time from Corpus Christi International Airport to Patras Araxos Airport is 12 hours and 39 minutes.
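The exact assumptions behind this estimate are not published. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to the great-circle distance flown at an average cruise speed; the sketch below uses hypothetical values of 30 minutes and 500 mph, so it approximates rather than reproduces the 12 hours and 39 minutes above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough block-time estimate: fixed taxi/climb/descent allowance plus cruise time.

    cruise_mph and overhead_hours are illustrative assumptions, not the site's values.
    """
    hours = overhead_hours + distance_miles / cruise_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    return f"{h} h {m:02d} min"

print(estimated_flight_time(6424))  # ≈ 13 h 21 min under these assumptions
```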

Flight carbon footprint between Corpus Christi International Airport (CRP) and Patras Araxos Airport (GPA)

On average, flying from Corpus Christi to Patras generates about 775 kg of CO2 per passenger, which is roughly 1,708 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
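For reference, the unit conversion and the emission rate implied by the 775 kg figure and the 6424-mile distance work out as in the short sketch below; the constants are taken from the numbers above.

```python
KG_PER_LB = 0.45359237   # definition of the avoirdupois pound

co2_kg = 775             # per-passenger estimate from above
distance_miles = 6424    # Vincenty distance from above

print(round(co2_kg / KG_PER_LB, 1))       # ≈ 1708.6 lbs
print(round(co2_kg / distance_miles, 3))  # ≈ 0.121 kg CO2 per passenger-mile
```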

Map of flight path from Corpus Christi to Patras

See the map of the shortest flight path between Corpus Christi International Airport (CRP) and Patras Araxos Airport (GPA).
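The shortest flight path on a sphere is a great-circle arc. To plot it yourself, you can interpolate intermediate waypoints with the standard spherical interpolation formula; the sketch below is one way to do that in Python, again using the airport coordinates in decimal degrees.

```python
import math

def great_circle_waypoints(lat1, lon1, lat2, lon2, n=50):
    """Return n+1 (lat, lon) points along the great circle from point 1 to point 2."""
    phi1, lam1 = math.radians(lat1), math.radians(lon1)
    phi2, lam2 = math.radians(lat2), math.radians(lon2)
    # Central angle between the endpoints (haversine form)
    d = 2 * math.asin(math.sqrt(
        math.sin((phi2 - phi1) / 2) ** 2 +
        math.cos(phi1) * math.cos(phi2) * math.sin((lam2 - lam1) / 2) ** 2))
    points = []
    for i in range(n + 1):
        f = i / n
        A = math.sin((1 - f) * d) / math.sin(d)
        B = math.sin(f * d) / math.sin(d)
        # Interpolate on the unit sphere, then convert back to lat/lon
        x = A * math.cos(phi1) * math.cos(lam1) + B * math.cos(phi2) * math.cos(lam2)
        y = A * math.cos(phi1) * math.sin(lam1) + B * math.cos(phi2) * math.sin(lam2)
        z = A * math.sin(phi1) + B * math.sin(phi2)
        points.append((math.degrees(math.atan2(z, math.hypot(x, y))),
                       math.degrees(math.atan2(y, x))))
    return points

# Waypoints along the CRP–GPA path, e.g. for drawing on a map
path = great_circle_waypoints(27.7703, -97.5012, 38.1508, 21.4256, n=20)
```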

Airport information

Origin: Corpus Christi International Airport
City: Corpus Christi, TX
Country: United States
IATA Code: CRP
ICAO Code: KCRP
Coordinates: 27°46′13″N, 97°30′4″W
Destination: Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E
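The distance sketches above take decimal degrees, so a small helper for the degree-minute-second strings listed here may be useful; this is a minimal sketch that assumes the exact ° ′ ″ N/S/E/W format shown.

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate such as 27°46′13″N or 97°30′4″W to signed decimal degrees."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, minutes, seconds, hemi = int(m.group(1)), int(m.group(2)), int(m.group(3)), m.group(4)
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemi in "SW" else value   # south and west are negative

print(dms_to_decimal("27°46′13″N"), dms_to_decimal("97°30′4″W"))   # CRP ≈ 27.7703, -97.5011
print(dms_to_decimal("38°9′3″N"), dms_to_decimal("21°25′32″E"))    # GPA ≈ 38.1508, 21.4256
```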