
How far is Marina Di Campo from Maceió?

The distance between Maceió (Zumbi dos Palmares International Airport) and Marina Di Campo (Marina di Campo Airport) is 4621 miles / 7437 kilometers / 4016 nautical miles.

Zumbi dos Palmares International Airport – Marina di Campo Airport

  • 4621 miles
  • 7437 kilometers
  • 4016 nautical miles


Distance from Maceió to Marina Di Campo

There are several ways to calculate the distance from Maceió to Marina Di Campo. Here are two standard methods:

Vincenty's formula (applied above)
  • 4621.010 miles
  • 7436.795 kilometers
  • 4015.548 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
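The iterative inverse method can be sketched in Python. This is a minimal implementation of Vincenty's inverse formula on the WGS-84 ellipsoid (the calculator's exact ellipsoid parameters are not stated, so WGS-84 is an assumption); the decimal coordinates are converted from the DMS values in the airport table below.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in meters between two points (Vincenty, WGS-84)."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # equatorial line: cos^2(alpha) = 0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# MCZ (9°30′38″S, 35°47′30″W) to EBA (42°45′37″N, 10°14′21″E)
d_km = vincenty_inverse(-9.510556, -35.791667, 42.760278, 10.239167) / 1000
# ≈ 7437 km
```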

Haversine formula
  • 4630.070 miles
  • 7451.375 kilometers
  • 4023.420 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
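The haversine calculation is much shorter, since it only needs a single mean Earth radius. A minimal sketch (assuming the common mean radius of 6371 km; the calculator's exact radius is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# MCZ to EBA, decimal degrees
d = haversine_km(-9.510556, -35.791667, 42.760278, 10.239167)  # ≈ 7451 km
```

The spherical result differs from the ellipsoidal Vincenty figure by roughly 15 km over this route, about 0.2%.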

How long does it take to fly from Maceió to Marina Di Campo?

The estimated flight time from Zumbi dos Palmares International Airport to Marina di Campo Airport is 9 hours and 14 minutes.
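The calculator does not state how it derives this estimate, but a flat average speed of about 500 mph over the 4621-mile route reproduces the figure (this speed is an assumption, not a published parameter):

```python
def flight_time(distance_miles, avg_speed_mph=500.0):
    """Estimate flight time as (hours, minutes), assuming a flat average speed."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = int((hours - h) * 60)  # truncate to whole minutes
    return h, m

h, m = flight_time(4621)  # (9, 14)
```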

Flight carbon footprint between Zumbi dos Palmares International Airport (MCZ) and Marina di Campo Airport (EBA)

On average, flying from Maceió to Marina Di Campo generates about 535 kg (1,179 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
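The kilogram-to-pound conversion uses the exact definition of the avoirdupois pound:

```python
KG_PER_LB = 0.45359237  # exact: 1 lb = 0.45359237 kg

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(535)))  # 1179
```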

Map of flight path from Maceió to Marina Di Campo

See the map of the shortest flight path between Zumbi dos Palmares International Airport (MCZ) and Marina di Campo Airport (EBA).

Airport information

Origin Zumbi dos Palmares International Airport
City: Maceió
Country: Brazil
IATA Code: MCZ
ICAO Code: SBMO
Coordinates: 9°30′38″S, 35°47′30″W
Destination Marina di Campo Airport
City: Marina Di Campo
Country: Italy
IATA Code: EBA
ICAO Code: LIRJ
Coordinates: 42°45′37″N, 10°14′21″E
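The coordinates above are given in degrees, minutes, and seconds; the distance formulas need signed decimal degrees. The conversion is straightforward (south and west hemispheres take a negative sign):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert DMS plus a hemisphere letter (N/S/E/W) to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60 + seconds / 3600)

# MCZ: 9°30′38″S, 35°47′30″W
lat = dms_to_decimal(9, 30, 38, "S")   # ≈ -9.5106
lon = dms_to_decimal(35, 47, 30, "W")  # ≈ -35.7917
```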