
How far is Pisa from Abuja?

The distance between Abuja (Nnamdi Azikiwe International Airport) and Pisa (Pisa International Airport) is 2395 miles / 3855 kilometers / 2081 nautical miles.

Nnamdi Azikiwe International Airport – Pisa International Airport

2395 miles / 3855 kilometers / 2081 nautical miles


Distance from Abuja to Pisa

There are several ways to calculate the distance from Abuja to Pisa. Here are two standard methods:

Vincenty's formula (applied above)
  • 2395.196 miles
  • 3854.695 kilometers
  • 2081.369 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
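For reference, here is a minimal pure-Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. This is an illustration of the method, not the calculator's actual code, and the coordinates used are the ones listed in the airport information below.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in metres between two points on the WGS-84 ellipsoid,
    computed with Vincenty's iterative inverse formula."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                        if cos_sq_alpha else 0.0)   # equatorial case
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break              # converged

    u_sq = cos_sq_alpha * (a * a - b * b) / (b * b)
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# ABV -> PSA, decimal-degree coordinates from the airport information below
print(vincenty_inverse(9.00667, 7.26306, 43.68389, 10.39250) / 1000)  # roughly 3855 km
```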

Haversine formula
  • 2403.362 miles
  • 3867.835 kilometers
  • 2088.464 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
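The haversine calculation fits in a few lines. Below is a sketch using a mean Earth radius of 6371 km (the radius is an assumption; a different value shifts the result slightly).

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) between two points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# ABV -> PSA, decimal-degree coordinates from the airport information below
print(haversine_distance(9.00667, 7.26306, 43.68389, 10.39250))  # roughly 3868 km
```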

How long does it take to fly from Abuja to Pisa?

The estimated flight time from Nnamdi Azikiwe International Airport to Pisa International Airport is 5 hours and 2 minutes.
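The calculator does not state how it derives this figure. As a rough, hypothetical back-of-the-envelope version, assuming an average block speed of about 500 mph plus a fixed 15-minute allowance for takeoff and landing (both assumptions) lands close to the estimate above:

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_minutes=15.0):
    """Rough flight-time estimate: distance at an assumed average speed,
    plus a fixed allowance for takeoff and landing (assumed values)."""
    total_minutes = distance_miles / cruise_mph * 60 + overhead_minutes
    hours, minutes = divmod(round(total_minutes), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimate_flight_time(2395.196))  # about 5 hours and 2 minutes
```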

What is the time difference between Abuja and Pisa?

Abuja observes West Africa Time (UTC+1) year-round, the same offset as Pisa's Central European Time (UTC+1), so there is normally no time difference between Abuja and Pisa. During European daylight saving time, Pisa is one hour ahead.

Flight carbon footprint between Nnamdi Azikiwe International Airport (ABV) and Pisa International Airport (PSA)

On average, flying from Abuja to Pisa generates about 263 kg of CO2 per passenger (equivalent to roughly 580 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
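As a rough illustration only: the numbers above imply an emission factor of about 68 g of CO2 per passenger-kilometre (263 kg over roughly 3855 km). The sketch below simply applies that implied factor; the calculator's actual methodology is not stated.

```python
def co2_per_passenger_kg(distance_km, co2_grams_per_pkm=68.2):
    """CO2 estimate: distance times an assumed per-passenger emission factor.
    68.2 g/passenger-km is just the factor implied by the figures above,
    not a published value from the calculator."""
    return distance_km * co2_grams_per_pkm / 1000

print(round(co2_per_passenger_kg(3854.695)))  # about 263 kg
```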

Map of flight path from Abuja to Pisa

See the map of the shortest flight path between Nnamdi Azikiwe International Airport (ABV) and Pisa International Airport (PSA).

Airport information

Origin: Nnamdi Azikiwe International Airport
City: Abuja
Country: Nigeria
IATA Code: ABV
ICAO Code: DNAA
Coordinates: 9°0′24″N, 7°15′47″E
Destination: Pisa International Airport
City: Pisa
Country: Italy
IATA Code: PSA
ICAO Code: LIRP
Coordinates: 43°41′2″N, 10°23′33″E
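The distance formulas above expect decimal degrees, while the coordinates here are listed in degrees, minutes, and seconds. A small conversion helper (a sketch):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Coordinates listed above
abv_lat = dms_to_decimal(9, 0, 24, "N")    # ~9.0067
abv_lon = dms_to_decimal(7, 15, 47, "E")   # ~7.2631
psa_lat = dms_to_decimal(43, 41, 2, "N")   # ~43.6839
psa_lon = dms_to_decimal(10, 23, 33, "E")  # ~10.3925
```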