
How far is Pontes e Lacerda from Abuja?

The distance between Abuja (Nnamdi Azikiwe International Airport) and Pontes e Lacerda (Pontes e Lacerda Airport) is 4861 miles / 7822 kilometers / 4224 nautical miles.


Distance from Abuja to Pontes e Lacerda

There are several ways to calculate the distance from Abuja to Pontes e Lacerda. Here are two standard methods:

Vincenty's formula (applied above)
  • 4860.536 miles
  • 7822.274 kilometers
  • 4223.690 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.

Haversine formula
  • 4858.596 miles
  • 7819.152 kilometers
  • 4222.004 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
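As a rough illustration, the haversine figure above can be reproduced in Python from the airport coordinates listed in the airport information section below. This is a minimal sketch, not the calculator's own code: the Earth radius of 3,958.8 miles, the DMS-to-decimal helper, and the unit-conversion constants are assumed values chosen for the example.

import math

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Convert degrees/minutes/seconds to signed decimal degrees
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

def haversine_miles(lat1, lon1, lat2, lon2, earth_radius_miles=3958.8):
    # Great-circle distance on a spherical Earth (haversine formula)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * earth_radius_miles * math.asin(math.sqrt(a))

# Coordinates taken from the airport information section below
abv_lat = dms_to_decimal(9, 0, 24, "N")
abv_lon = dms_to_decimal(7, 15, 47, "E")
lcb_lat = dms_to_decimal(15, 11, 36, "S")
lcb_lon = dms_to_decimal(59, 23, 5, "W")

miles = haversine_miles(abv_lat, abv_lon, lcb_lat, lcb_lon)
print(f"{miles:.1f} mi / {miles * 1.609344:.1f} km / {miles / 1.150779:.1f} nmi")
# Prints roughly 4858.6 mi / 7819.2 km / 4222.0 nmi, matching the haversine figures above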

How long does it take to fly from Abuja to Pontes e Lacerda?

The estimated flight time from Nnamdi Azikiwe International Airport to Pontes e Lacerda Airport is 9 hours and 42 minutes.
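The page does not state how this estimate is derived. One common rule of thumb is to divide the great-circle distance by a typical jet cruise speed of roughly 500 mph; the sketch below uses that assumption and lands close to the quoted figure.

# Assumption: flight time ≈ distance / typical cruise speed (~500 mph).
# The calculator's exact method is not stated on the page.
distance_miles = 4860.536        # Vincenty distance from the section above
cruise_speed_mph = 500           # assumed average speed

hours = distance_miles / cruise_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"Approximately {h} h {m} min")   # ~9 h 43 min, close to the quoted 9 h 42 min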

Flight carbon footprint between Nnamdi Azikiwe International Airport (ABV) and Pontes e Lacerda Airport (LCB)

On average, flying from Abuja to Pontes e Lacerda generates about 566 kg of CO2 per passenger, which is roughly 1,247 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
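As a quick sanity check, the implied per-kilometer emission factor and the kilogram-to-pound conversion can be recomputed from the figures quoted above. The emission factor here is back-calculated from the page's own numbers, not taken from any official methodology.

# Back-of-the-envelope check of the quoted CO2 figures
distance_km = 7822.274
co2_kg = 566

factor_kg_per_km = co2_kg / distance_km      # ~0.072 kg CO2 per passenger-km (implied)
co2_lbs = co2_kg * 2.20462                   # kilograms to pounds
print(f"{factor_kg_per_km:.3f} kg/km, {co2_lbs:.0f} lbs")
# Prints ~0.072 kg/km and ~1248 lbs (the page rounds this to 1,247)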

Map of flight path from Abuja to Pontes e Lacerda

See the map of the shortest flight path between Nnamdi Azikiwe International Airport (ABV) and Pontes e Lacerda Airport (LCB).

Airport information

Origin: Nnamdi Azikiwe International Airport
City: Abuja
Country: Nigeria
IATA Code: ABV
ICAO Code: DNAA
Coordinates: 9°0′24″N, 7°15′47″E

Destination: Pontes e Lacerda Airport
City: Pontes e Lacerda
Country: Brazil
IATA Code: LCB
ICAO Code: SWBG
Coordinates: 15°11′36″S, 59°23′5″W