
How far is Bar Harbor, ME, from Santa Cruz das Flores?

The distance between Santa Cruz das Flores (Flores Airport) and Bar Harbor (Hancock County–Bar Harbor Airport) is 1932 miles / 3109 kilometers / 1678 nautical miles.

Flores Airport – Hancock County–Bar Harbor Airport: 1932 miles / 3109 kilometers / 1678 nautical miles


Distance from Santa Cruz das Flores to Bar Harbor

There are several ways to calculate the distance from Santa Cruz das Flores to Bar Harbor. Here are two standard methods:

Vincenty's formula (applied above)
  • 1931.574 miles
  • 3108.567 kilometers
  • 1678.492 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
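For illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the FLW and BHB coordinates listed in the airport information below. It is not the calculator's own code, but its result should agree closely with the figures above.

```python
import math

# WGS-84 ellipsoid parameters
A_AXIS = 6378137.0                   # semi-major axis in metres
FLATTENING = 1 / 298.257223563       # flattening
B_AXIS = (1 - FLATTENING) * A_AXIS   # semi-minor axis

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in metres between two points (Vincenty's inverse formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)

    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - FLATTENING) * math.tan(phi1))
    U2 = math.atan((1 - FLATTENING) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); treated as zero for points on the equator
        cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = FLATTENING / 16 * cos_sq_alpha * (4 + FLATTENING * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * FLATTENING * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2)
        - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return B_AXIS * A * (sigma - delta_sigma)  # metres

# FLW and BHB coordinates from the airport information below
flw = (39 + 27/60 + 19/3600, -(31 + 7/60 + 53/3600))
bhb = (44 + 27/60, -(68 + 21/60 + 41/3600))
metres = vincenty_inverse(flw[0], flw[1], bhb[0], bhb[1])
print(f"{metres / 1609.344:.3f} miles / {metres / 1000:.3f} km")  # should be close to the Vincenty figures quoted above
```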

Haversine formula
  • 1926.760 miles
  • 3100.820 kilometers
  • 1674.309 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
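A minimal Python sketch of the haversine formula follows, again using the FLW and BHB coordinates from the airport information below. The mean Earth radius of 6371 km is an assumption, since the calculator does not state which radius it uses, so the result may differ slightly from the figures above.

```python
import math

MEAN_EARTH_RADIUS_KM = 6371.0  # assumed mean Earth radius; not stated by the calculator

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * MEAN_EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# FLW (39°27′19″N, 31°7′53″W) to BHB (44°27′0″N, 68°21′41″W)
km = haversine_km(39.4553, -31.1314, 44.45, -68.3614)
print(f"{km:.1f} km / {km / 1.609344:.1f} miles")  # close to the ~3101 km quoted above
```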

How long does it take to fly from Santa Cruz das Flores to Bar Harbor?

The estimated flight time from Flores Airport to Hancock County–Bar Harbor Airport is 4 hours and 9 minutes.
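The calculator does not publish the assumptions behind this estimate. A common rule of thumb, sketched below, divides the distance by a nominal average speed and adds a fixed allowance for taxi, climb and descent; the speed and allowance shown are illustrative assumptions and will not reproduce the quoted 4 hours and 9 minutes exactly.

```python
# Rough flight-time estimate: distance / assumed cruise speed + fixed allowance.
# Both parameters are assumptions, not values published by the calculator.
CRUISE_MPH = 500          # assumed average cruise speed
OVERHEAD_MINUTES = 30     # assumed allowance for taxi, climb and descent

distance_miles = 1931.574  # Vincenty distance from above
total_minutes = distance_miles / CRUISE_MPH * 60 + OVERHEAD_MINUTES
print(f"{int(total_minutes // 60)} h {int(total_minutes % 60)} min")  # about 4 h 21 min with these assumptions
```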

Flight carbon footprint between Flores Airport (FLW) and Hancock County–Bar Harbor Airport (BHB)

On average, flying from Santa Cruz das Flores to Bar Harbor generates about 211 kg (466 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
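The page reports only the per-passenger total. As a rough illustration, the sketch below multiplies an assumed fuel burn per passenger-kilometre by the standard factor of about 3.16 kg of CO2 per kg of jet fuel burned; the fuel-burn figure is an assumption, not a value published by the calculator.

```python
# Back-of-the-envelope CO2 estimate; the per-passenger fuel burn is an assumed figure.
CO2_PER_KG_FUEL = 3.16       # kg of CO2 released per kg of jet fuel burned (standard factor)
FUEL_KG_PER_PAX_KM = 0.022   # assumed fuel burn per passenger-kilometre

distance_km = 3108.567       # Vincenty distance from above
co2_kg = distance_km * FUEL_KG_PER_PAX_KM * CO2_PER_KG_FUEL
print(f"{co2_kg:.0f} kg CO2 per passenger ({co2_kg * 2.20462:.0f} lbs)")  # ~216 kg with these assumptions, in the same range as the ~211 kg quoted above
```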

Map of flight path from Santa Cruz das Flores to Bar Harbor

See the map of the shortest flight path between Flores Airport (FLW) and Hancock County–Bar Harbor Airport (BHB).

Airport information

Origin: Flores Airport
City: Santa Cruz das Flores
Country: Portugal
IATA Code: FLW
ICAO Code: LPFL
Coordinates: 39°27′19″N, 31°7′53″W
Destination: Hancock County–Bar Harbor Airport
City: Bar Harbor, ME
Country: United States
IATA Code: BHB
ICAO Code: KBHB
Coordinates: 44°27′0″N, 68°21′41″W