
How far is Graciosa Island from Baltimore, MD?

The distance between Baltimore (Baltimore–Washington International Airport) and Graciosa Island (Graciosa Airport) is 2581 miles / 4153 kilometers / 2242 nautical miles.



Distance from Baltimore to Graciosa Island

There are several ways to calculate the distance from Baltimore to Graciosa Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 2580.602 miles
  • 4153.077 kilometers
  • 2242.482 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
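If you want to reproduce the ellipsoidal figure yourself, the geographiclib Python package computes geodesic distances on the WGS84 ellipsoid. Note that it implements Karney's algorithm rather than Vincenty's, but both use the same ellipsoidal model and agree to well within a meter here; the decimal-degree coordinates are our conversion of the values listed in the airport information section below.

```python
# Ellipsoidal (WGS84) distance, as a cross-check on Vincenty's figure.
# Requires: pip install geographiclib
from geographiclib.geodesic import Geodesic

# Decimal-degree approximations of the coordinates listed below
# (39°10′31″N, 76°40′5″W and 39°5′31″N, 28°1′47″W).
BWI = (39.1753, -76.6681)
GRW = (39.0919, -28.0297)

result = Geodesic.WGS84.Inverse(BWI[0], BWI[1], GRW[0], GRW[1])
km = result["s12"] / 1000  # s12 is the geodesic length in meters
print(f"{km:.1f} km, {km / 1.609344:.1f} mi")  # ~4153 km / ~2581 mi
```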

Haversine formula
  • 2574.285 miles
  • 4142.910 kilometers
  • 2236.992 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
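The spherical figure fits in a few lines of code. Here is a minimal sketch of the haversine formula using the same coordinates as above; the 6371 km mean Earth radius is a common convention, and other radii shift the result slightly.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# BWI -> GRW with the coordinates from the airport information section
print(f"{haversine_km(39.1753, -76.6681, 39.0919, -28.0297):.1f} km")  # ~4143 km
```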

How long does it take to fly from Baltimore to Graciosa Island?

The estimated flight time from Baltimore–Washington International Airport to Graciosa Airport is 5 hours and 23 minutes.
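The published time is consistent with a simple model: a fixed allowance for take-off and landing plus cruise at a typical jet speed. The speed and allowance in the sketch below are illustrative assumptions, not the calculator's published parameters; they happen to land very close to the quoted 5 hours and 23 minutes.

```python
# Rough flight-time model: taxi/climb/descent allowance + cruise time.
CRUISE_KNOTS = 460   # assumed average ground speed, for illustration
OVERHEAD_MIN = 30    # assumed take-off/landing allowance, for illustration

nm = 2242.482        # nautical miles (Vincenty figure above)
total_min = OVERHEAD_MIN + nm / CRUISE_KNOTS * 60
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # ~5 h 22 min
```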

Flight carbon footprint between Baltimore–Washington International Airport (BWI) and Graciosa Airport (GRW)

On average, flying from Baltimore to Graciosa Island generates about 285 kg of CO2 per passenger, which is roughly 627 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
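As a quick unit check on that conversion:

```python
# Kilograms of CO2 to pounds.
KG_TO_LB = 2.20462
print(f"{285 * KG_TO_LB:.0f} lb")  # ~628 lb; the page's 627 suggests the
                                   # unrounded kg figure is just under 285
```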

Map of flight path from Baltimore to Graciosa Island

See the map of the shortest flight path between Baltimore–Washington International Airport (BWI) and Graciosa Airport (GRW).

Airport information

Origin: Baltimore–Washington International Airport
City: Baltimore, MD
Country: United States
IATA Code: BWI
ICAO Code: KBWI
Coordinates: 39°10′31″N, 76°40′5″W

Destination: Graciosa Airport
City: Graciosa Island
Country: Portugal
IATA Code: GRW
ICAO Code: LPGR
Coordinates: 39°5′31″N, 28°1′47″W
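The coordinates above are given in degrees, minutes, and seconds. A small sketch of the conversion to the decimal degrees used in the distance examples earlier (the helper name is ours):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds + hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# BWI: 39°10′31″N, 76°40′5″W  ->  (39.1753, -76.6681)
print(dms_to_decimal(39, 10, 31, "N"), dms_to_decimal(76, 40, 5, "W"))
```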