
How far is Patras from Rio De Janeiro?

The distance between Rio De Janeiro (Rio de Janeiro–Galeão International Airport) and Patras (Patras Araxos Airport) is 5929 miles / 9543 kilometers / 5153 nautical miles.

Rio de Janeiro–Galeão International Airport – Patras Araxos Airport

  • 5929 miles
  • 9543 kilometers
  • 5153 nautical miles
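The three figures above are the same distance expressed in different units. As a quick sanity check, the conversions can be sketched with the exact definitions 1 mile = 1.609344 km and 1 nautical mile = 1.852 km (the distance value is the Vincenty figure quoted on this page):

```python
KM_PER_MILE = 1.609344  # exact, by international definition
KM_PER_NMI = 1.852      # exact, by international definition

km = 9542.571                    # Vincenty distance quoted on this page
miles = km / KM_PER_MILE         # ~5929.5 statute miles
nautical = km / KM_PER_NMI       # ~5152.6 nautical miles

print(f"{km:.0f} km = {miles:.0f} mi = {nautical:.0f} nmi")
```

Rounding each result to the nearest whole unit reproduces the 5929 / 9543 / 5153 figures shown above.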


Distance from Rio De Janeiro to Patras

There are several ways to calculate the distance from Rio De Janeiro to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 5929.479 miles
  • 9542.571 kilometers
  • 5152.576 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
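The ellipsoidal calculation above can be sketched with a standard implementation of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates below are the airport coordinates listed at the bottom of this page, converted to decimal degrees; the iteration count and tolerance are conventional choices, not values taken from this page:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance (km) on the WGS-84 ellipsoid via Vincenty's inverse method."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis (m)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):    # iterate on the longitude difference lambda
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinLam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinLam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # equatorial-line special case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# GIG (22°48′35″S, 43°15′2″W) to GPA (38°9′3″N, 21°25′32″E)
d = vincenty_km(-22.809722, -43.250556, 38.150833, 21.425556)
print(f"{d:.3f} km")
```

Run with these coordinates, the result agrees with the 9542.571 km figure above to well within a kilometer.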

Haversine formula
  • 5938.782 miles
  • 9557.543 kilometers
  • 5160.660 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
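The spherical calculation is much shorter. A sketch of the haversine formula, using a conventional mean Earth radius of 6371 km and the airport coordinates listed below in decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) on a sphere of mean Earth radius ~6371 km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# GIG (22°48′35″S, 43°15′2″W) to GPA (38°9′3″N, 21°25′32″E)
d = haversine_km(-22.809722, -43.250556, 38.150833, 21.425556)
print(f"{d:.3f} km")
```

Because the sphere only approximates the ellipsoid, this lands about 15 km from the Vincenty result, matching the 9557.543 km figure above.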

How long does it take to fly from Rio De Janeiro to Patras?

The estimated flight time from Rio de Janeiro–Galeão International Airport to Patras Araxos Airport is 11 hours and 43 minutes.
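A flight-time estimate of this kind is typically the distance divided by an assumed average cruise speed, plus a fixed allowance for taxi, climb, and descent. The speed and allowance below are illustrative assumptions, not values published on this page, so the result differs somewhat from the 11 h 43 min figure above (which implies a faster effective speed):

```python
CRUISE_MPH = 500.0  # assumed average cruise speed (illustrative assumption)
BUFFER_H = 0.5      # assumed taxi/climb/descent allowance (illustrative assumption)

def estimate_flight_time(distance_miles, cruise_mph=CRUISE_MPH, buffer_h=BUFFER_H):
    """Return (hours, minutes) for a simple distance/speed + buffer estimate."""
    total_h = distance_miles / cruise_mph + buffer_h
    hours = int(total_h)
    minutes = round((total_h - hours) * 60)
    return hours, minutes

h, m = estimate_flight_time(5929.479)  # Vincenty distance in miles
print(f"{h} h {m} min")                # prints "12 h 22 min" under these assumptions
```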

Flight carbon footprint between Rio de Janeiro–Galeão International Airport (GIG) and Patras Araxos Airport (GPA)

On average, flying from Rio De Janeiro to Patras generates about 707 kg (1,559 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
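The pound figure follows from the per-passenger estimate by a straight unit conversion (1 kg ≈ 2.20462262 lb):

```python
LB_PER_KG = 2.20462262  # pounds per kilogram

co2_kg = 707                 # per-passenger estimate quoted above
co2_lb = co2_kg * LB_PER_KG  # ~1558.7 lb, rounds to 1,559

print(f"{co2_kg} kg = {co2_lb:.0f} lb")
```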

Map of flight path from Rio De Janeiro to Patras

See the map of the shortest flight path between Rio de Janeiro–Galeão International Airport (GIG) and Patras Araxos Airport (GPA).

Airport information

Origin: Rio de Janeiro–Galeão International Airport
City: Rio De Janeiro
Country: Brazil
IATA Code: GIG
ICAO Code: SBGL
Coordinates: 22°48′35″S, 43°15′2″W
Destination: Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E