
How far is Patras from Bandar Abbas?

The distance between Bandar Abbas (Bandar Abbas International Airport) and Patras (Patras Araxos Airport) is 2156 miles / 3470 kilometers / 1874 nautical miles.

Bandar Abbas International Airport – Patras Araxos Airport

Distance: 2156 miles / 3470 kilometers / 1874 nautical miles
Flight time: 4 h 34 min
Time difference: 1 h 30 min
CO2 emission: 235 kg


Distance from Bandar Abbas to Patras

There are several ways to calculate the distance from Bandar Abbas to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 2156.279 miles
  • 3470.194 kilometers
  • 1873.755 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
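As a rough illustration, here is a minimal Python sketch of the standard Vincenty inverse formula on the WGS-84 ellipsoid. It is not the calculator's actual implementation; the decimal coordinates are converted from the airport information section below.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a, f = 6378137.0, 1 / 298.257223563      # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1, U2 = math.atan((1 - f) * math.tan(phi1)), math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344                  # meters -> statute miles

# BND (27°13′5″N, 56°22′40″E) to GPA (38°9′3″N, 21°25′32″E)
print(round(vincenty_miles(27.2181, 56.3778, 38.1508, 21.4256), 1))  # ≈ 2156 miles
```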

Haversine formula
  • 2153.023 miles
  • 3464.955 kilometers
  • 1870.926 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
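For comparison, a minimal Python sketch of the haversine formula follows, using a mean Earth radius of 6371 km (the exact radius the calculator assumes is not stated).

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = math.sin(d_phi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# BND to GPA, decimal degrees converted from the coordinates listed below
print(round(haversine_km(27.2181, 56.3778, 38.1508, 21.4256), 1))  # ≈ 3465 km
```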

How long does it take to fly from Bandar Abbas to Patras?

The estimated flight time from Bandar Abbas International Airport to Patras Araxos Airport is 4 hours and 34 minutes.
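The calculator's exact speed and buffer parameters are not published. A common rule of thumb divides the great-circle distance by an average block speed; the speed used below is an assumption chosen only to land near the figure quoted above.

```python
def flight_time(distance_miles: float, block_speed_mph: float = 470.0) -> str:
    """Crude block-time estimate: distance divided by an assumed average block speed."""
    hours = distance_miles / block_speed_mph
    h, m = divmod(round(hours * 60), 60)
    return f"{h} h {m} min"

print(flight_time(2156))  # ≈ 4 h 35 min with the assumed speed
```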

Flight carbon footprint between Bandar Abbas International Airport (BND) and Patras Araxos Airport (GPA)

On average, flying from Bandar Abbas to Patras generates about 235 kg of CO2 per passenger, which is roughly 519 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
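The emission model behind this figure is not published. One simple sketch multiplies the flight distance by an average per-passenger emission factor for jet-fuel burn; the factor below is back-calculated from the numbers on this page and should be treated as an assumption, not a standard constant.

```python
def co2_per_passenger_kg(distance_km: float, kg_per_pax_km: float = 0.0677) -> float:
    """Per-passenger CO2 from fuel burn only (no radiative-forcing uplift)."""
    return distance_km * kg_per_pax_km

print(round(co2_per_passenger_kg(3470)))  # ≈ 235 kg with the assumed factor
```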

Map of flight path from Bandar Abbas to Patras

See the map of the shortest flight path between Bandar Abbas International Airport (BND) and Patras Araxos Airport (GPA).

Airport information

Origin: Bandar Abbas International Airport
City: Bandar Abbas
Country: Iran
IATA Code: BND
ICAO Code: OIKB
Coordinates: 27°13′5″N, 56°22′40″E
Destination: Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E