
How far is Abuja from São Tomé?

The distance between São Tomé (São Tomé International Airport) and Abuja (Nnamdi Azikiwe International Airport) is 594 miles / 956 kilometers / 516 nautical miles.

São Tomé International Airport – Nnamdi Azikiwe International Airport


Distance from São Tomé to Abuja

There are several ways to calculate the distance from São Tomé to Abuja. Here are two standard methods:

Vincenty's formula (applied above)
  • 594.116 miles
  • 956.138 kilometers
  • 516.273 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
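As a sketch of how an ellipsoidal distance like the one above can be computed, here is the standard Vincenty inverse formula on the WGS-84 ellipsoid in Python. The decimal coordinates are converted from the DMS values in the airport information section; small differences in rounding or ellipsoid constants can shift the result by a fraction of a kilometer.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate on the longitude difference
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# TMS (0.3781°N, 6.7119°E) to ABV (9.0067°N, 7.2631°E)
km = vincenty_km(0.3781, 6.7119, 9.0067, 7.2631)
print(f"{km:.1f} km")  # ≈ 956 km, matching the figure above
```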

Haversine formula
  • 597.384 miles
  • 961.397 kilometers
  • 519.113 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
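The haversine calculation is much shorter and can be sketched in a few lines of Python. The decimal coordinates are converted from the DMS values in the airport information section, and the Earth radius of 6,371 km is the conventional mean-radius assumption for the spherical model.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers, assuming a spherical Earth."""
    R = 6371.0  # mean Earth radius in km (spherical-model assumption)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# TMS: 0°22′41″N, 6°42′43″E  ->  (0.3781, 6.7119) decimal degrees
# ABV: 9°0′24″N,  7°15′47″E  ->  (9.0067, 7.2631)
km = haversine_km(0.3781, 6.7119, 9.0067, 7.2631)
print(f"{km:.1f} km")  # ≈ 961 km, matching the figure above
```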

How long does it take to fly from São Tomé to Abuja?

The estimated flight time from São Tomé International Airport to Nnamdi Azikiwe International Airport is 1 hour and 37 minutes.
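The exact timing model behind this estimate is not stated. A common rule of thumb is cruise distance at roughly 500 mph plus a fixed allowance of about 30 minutes for takeoff, climb, and landing; the sketch below uses those assumed figures, so its result differs slightly from the 1 hour 37 minutes quoted above.

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: assumed cruise speed plus fixed overhead."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(594)  # route distance from above
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # → 1 h 41 min
```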

Flight carbon footprint between São Tomé International Airport (TMS) and Nnamdi Azikiwe International Airport (ABV)

On average, flying from São Tomé to Abuja generates about 112 kg of CO2 per passenger (112 kilograms is equal to 247 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
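The kilogram-to-pound conversion above can be checked with the standard factor of about 2.20462 pounds per kilogram:

```python
CO2_KG = 112          # per-passenger estimate from the page
KG_TO_LB = 2.20462    # pounds per kilogram
print(round(CO2_KG * KG_TO_LB), "lbs")  # → 247 lbs
```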

Map of flight path from São Tomé to Abuja

See the map of the shortest flight path between São Tomé International Airport (TMS) and Nnamdi Azikiwe International Airport (ABV).

Airport information

Origin São Tomé International Airport
City: São Tomé
Country: São Tomé and Príncipe
IATA Code: TMS
ICAO Code: FPST
Coordinates: 0°22′41″N, 6°42′43″E
Destination Nnamdi Azikiwe International Airport
City: Abuja
Country: Nigeria
IATA Code: ABV
ICAO Code: DNAA
Coordinates: 9°0′24″N, 7°15′47″E