
How far is Graciosa Island from Charlotte, NC?

The distance between Charlotte (Charlotte Douglas International Airport) and Graciosa Island (Graciosa Airport) is 2892 miles / 4654 kilometers / 2513 nautical miles.

Charlotte Douglas International Airport – Graciosa Airport

  • 2892 miles
  • 4654 kilometers
  • 2513 nautical miles


Distance from Charlotte to Graciosa Island

There are several ways to calculate the distance from Charlotte to Graciosa Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 2891.721 miles
  • 4653.773 kilometers
  • 2512.836 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
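
For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is an illustrative implementation, not the calculator's own code, and the coordinates used are the decimal-degree equivalents of the airport coordinates listed at the bottom of the page.

    from math import radians, sin, cos, tan, atan, atan2, sqrt

    def vincenty_miles(lat1, lon1, lat2, lon2):
        # WGS-84 ellipsoid parameters
        a = 6378137.0                  # semi-major axis (metres)
        f = 1 / 298.257223563          # flattening
        b = (1 - f) * a                # semi-minor axis

        U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
        U2 = atan((1 - f) * tan(radians(lat2)))
        L = radians(lon2 - lon1)
        sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

        lam = L
        for _ in range(200):                       # iterate until lambda converges
            sin_sig = sqrt((cosU2 * sin(lam)) ** 2 +
                           (cosU1 * sinU2 - sinU1 * cosU2 * cos(lam)) ** 2)
            if sin_sig == 0:
                return 0.0                         # coincident points
            cos_sig = sinU1 * sinU2 + cosU1 * cosU2 * cos(lam)
            sigma = atan2(sin_sig, cos_sig)
            sin_alpha = cosU1 * cosU2 * sin(lam) / sin_sig
            cos2_alpha = 1 - sin_alpha ** 2        # zero only for equatorial lines
            cos_2sig_m = cos_sig - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sig * (cos_2sig_m +
                C * cos_sig * (-1 + 2 * cos_2sig_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sig * (cos_2sig_m + B / 4 * (
            cos_sig * (-1 + 2 * cos_2sig_m ** 2) -
            B / 6 * cos_2sig_m * (-3 + 4 * sin_sig ** 2) *
            (-3 + 4 * cos_2sig_m ** 2)))
        metres = b * A * (sigma - d_sigma)
        return metres / 1609.344                   # metres -> statute miles

    # CLT (35.2139 N, -80.9431 W) to GRW (39.0919 N, -28.0297 W)
    print(vincenty_miles(35.2139, -80.9431, 39.0919, -28.0297))  # ~2891.7 miles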

Haversine formula
  • 2885.077 miles
  • 4643.081 kilometers
  • 2507.063 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
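
A haversine sketch in Python, assuming the conventional mean Earth radius of 6,371 km (the exact radius constant the site uses isn't stated):

    from math import radians, sin, cos, asin, sqrt

    def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # Great-circle distance on a sphere of the given mean radius
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        h = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        km = 2 * radius_km * asin(sqrt(h))
        return km / 1.609344                      # kilometres -> statute miles

    print(haversine_miles(35.2139, -80.9431, 39.0919, -28.0297))  # ~2885 miles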

How long does it take to fly from Charlotte to Graciosa Island?

The estimated flight time from Charlotte Douglas International Airport to Graciosa Airport is 5 hours and 58 minutes.
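
The calculator doesn't publish its timing model, but estimates like this are usually a fixed taxi/climb/descent allowance plus cruise at a constant speed. A rough sketch with placeholder values (30 minutes overhead, 500 mph cruise); note that these assumptions land a little above the quoted 5 hours and 58 minutes:

    def estimated_block_time(distance_miles, cruise_mph=500, overhead_min=30):
        # Fixed ground/climb/descent allowance plus cruise at a constant speed
        total_min = overhead_min + distance_miles / cruise_mph * 60
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} h {minutes} min"

    print(estimated_block_time(2891.721))  # ~6 h 17 min with these assumptions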

Flight carbon footprint between Charlotte Douglas International Airport (CLT) and Graciosa Airport (GRW)

On average, flying from Charlotte to Graciosa Island generates about 321 kg of CO2 per passenger, which is roughly 708 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
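
As a sanity check on the arithmetic, an emission factor of roughly 0.111 kg of CO2 per passenger-mile (back-calculated from the figures above, not an official factor) reproduces both numbers:

    KG_PER_LB = 0.45359237

    def co2_per_passenger_kg(distance_miles, kg_per_mile=0.111):
        # Assumed per-passenger emission factor; real factors vary by aircraft and load
        return distance_miles * kg_per_mile

    kg = co2_per_passenger_kg(2891.721)
    print(round(kg), "kg =", round(kg / KG_PER_LB), "lbs")  # ~321 kg = ~708 lbs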

Map of flight path from Charlotte to Graciosa Island

See the map of the shortest flight path between Charlotte Douglas International Airport (CLT) and Graciosa Airport (GRW).

Airport information

Origin: Charlotte Douglas International Airport
City: Charlotte, NC
Country: United States
IATA Code: CLT
ICAO Code: KCLT
Coordinates: 35°12′50″N, 80°56′35″W

Destination: Graciosa Airport
City: Graciosa Island
Country: Portugal
IATA Code: GRW
ICAO Code: LPGR
Coordinates: 39°5′31″N, 28°1′47″W
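
The coordinates above are given in degrees/minutes/seconds, while the distance formulas earlier expect signed decimal degrees. A small conversion sketch (dms_to_decimal is a hypothetical helper, not part of the site):

    import re

    def dms_to_decimal(dms):
        # Parse strings like 35°12′50″N or 80°56′35″W into signed decimal degrees
        deg, minutes, seconds, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
        value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
        return -value if hemi in "SW" else value

    print(dms_to_decimal("35°12′50″N"), dms_to_decimal("80°56′35″W"))   # 35.2139, -80.9431
    print(dms_to_decimal("39°5′31″N"), dms_to_decimal("28°1′47″W"))     # 39.0919, -28.0297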