
How far is Patras from Baltimore, MD?

The distance between Baltimore (Baltimore–Washington International Airport) and Patras (Patras Araxos Airport) is 5006 miles / 8056 kilometers / 4350 nautical miles.

Distance from Baltimore to Patras

There are several ways to calculate the distance from Baltimore to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 5005.897 miles
  • 8056.210 kilometers
  • 4350.005 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
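As a rough cross-check, an ellipsoidal distance very close to the Vincenty figures above can be computed with the geopy library; note that current geopy versions implement Karney's geodesic algorithm rather than Vincenty's, but both work on the WGS-84 ellipsoid and agree to well within a metre on a route like this. The coordinates are the airport coordinates listed under "Airport information" below.

```python
# Sketch: ellipsoidal distance via geopy (Karney's geodesic on WGS-84,
# which closely matches Vincenty's result for this route).
from geopy.distance import geodesic

bwi = (39.17528, -76.66806)  # 39°10′31″N, 76°40′5″W
gpa = (38.15083, 21.42556)   # 38°9′3″N, 21°25′32″E

d = geodesic(bwi, gpa)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} nm")
# Should land very close to 5005.897 mi / 8056.210 km / 4350.005 nm
```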

Haversine formula
  • 4993.967 miles
  • 8037.010 kilometers
  • 4339.638 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
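For reference, here is a minimal haversine sketch in Python, assuming a mean earth radius of 6371 km and using the airport coordinates listed below; it reproduces the kilometre figure above.

```python
# Minimal haversine sketch: great-circle distance on a sphere
# of mean radius 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometers between two lat/lon points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

print(f"{haversine_km(39.17528, -76.66806, 38.15083, 21.42556):.1f} km")
# -> about 8037.0 km, matching the haversine figure above
```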

How long does it take to fly from Baltimore to Patras?

The estimated flight time from Baltimore–Washington International Airport to Patras Araxos Airport is 9 hours and 58 minutes.
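The page does not say how this estimate is derived; a common back-of-the-envelope method (an assumption here, not a published formula) divides the distance by an average block speed of roughly 500 mph, which lands within minutes of the figure above.

```python
# Rough flight-time sketch (assumed method, not the site's formula):
# distance divided by an assumed average block speed.
distance_mi = 5006
avg_speed_mph = 502  # assumed; about 502 mph reproduces 9 h 58 min
h, m = divmod(round(distance_mi / avg_speed_mph * 60), 60)
print(f"{h} h {m} min")  # -> 9 h 58 min
```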

Flight carbon footprint between Baltimore–Washington International Airport (BWI) and Patras Araxos Airport (GPA)

On average, flying from Baltimore to Patras generates about 585 kg of CO2 per passenger, equivalent to 1,289 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
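The per-passenger figure is consistent with multiplying the distance by an average emission intensity of roughly 0.073 kg of CO2 per passenger-kilometre; that intensity is an assumption chosen here for illustration, not a value published on this page.

```python
# Hedged CO2 sketch: distance times an assumed per-passenger
# emission intensity (the 0.0726 factor is an assumption).
distance_km = 8056
kg_co2_per_pax_km = 0.0726
kg_co2 = distance_km * kg_co2_per_pax_km
print(f"{kg_co2:.0f} kg CO2 ({kg_co2 * 2.20462:.0f} lbs)")
# -> 585 kg CO2 (1289 lbs)
```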

Map of flight path from Baltimore to Patras

See the map of the shortest flight path between Baltimore–Washington International Airport (BWI) and Patras Araxos Airport (GPA).

Airport information

Origin: Baltimore–Washington International Airport
City: Baltimore, MD
Country: United States
IATA Code: BWI
ICAO Code: KBWI
Coordinates: 39°10′31″N, 76°40′5″W

Destination: Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E
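The coordinates above are given in degrees, minutes and seconds; the decimal degrees used in the code sketches earlier can be recovered with a small helper (a convenience conversion, not something from this page).

```python
# Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees.
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Hemisphere is one of 'N', 'S', 'E', 'W'."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(39, 10, 31, "N"))  # ~39.17528 (BWI latitude)
print(dms_to_decimal(76, 40, 5, "W"))   # ~-76.66806 (BWI longitude)
```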