
How far is San Antonio de Palé from Smara?

The distance between Smara (Smara Airport) and San Antonio de Palé (Annobón Airport) is 2253 miles / 3626 kilometers / 1958 nautical miles.
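The three figures are the same distance expressed in different units; the conversion factors are exact by definition (1 mile = 1.609344 km, 1 nautical mile = 1.852 km). A minimal sketch of the conversion:

```python
KM_PER_MILE = 1.609344        # exact, by international definition
KM_PER_NAUTICAL_MILE = 1.852  # exact, by international definition

def km_to_miles(km: float) -> float:
    return km / KM_PER_MILE

def km_to_nautical_miles(km: float) -> float:
    return km / KM_PER_NAUTICAL_MILE

print(round(km_to_miles(3625.902), 3))           # ≈ 2253.031 miles
print(round(km_to_nautical_miles(3625.902), 3))  # ≈ 1957.831 nautical miles
```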

Smara Airport – Annobón Airport

2253 miles / 3626 kilometers / 1958 nautical miles


Distance from Smara to San Antonio de Palé

There are several ways to calculate the distance from Smara to San Antonio de Palé. Here are two standard methods:

Vincenty's formula (applied above)
  • 2253.031 miles
  • 3625.902 kilometers
  • 1957.831 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
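The iterative inverse solution on the WGS-84 ellipsoid can be sketched as below. This is a standard textbook implementation, not necessarily the exact code this site runs; the coordinates are the airport positions listed further down, converted to decimal degrees.

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    """Ellipsoidal (WGS-84) distance in metres via Vincenty's inverse formula."""
    a = 6378137.0               # WGS-84 semi-major axis, metres
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):        # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# SMW → NBN
d = vincenty_m(26.731667, -11.684444, -1.41, 5.621667)
print(f"{d / 1000:.3f} km")  # ≈ 3625.9 km
```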

Haversine formula
  • 2260.474 miles
  • 3637.880 kilometers
  • 1964.298 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
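The haversine calculation fits in a few lines. The mean Earth radius of 6371 km is the conventional choice; whether the site uses exactly this value is an assumption.

```python
import math

EARTH_RADIUS_KM = 6371.0  # conventional mean Earth radius (assumed here)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(h))

# SMW → NBN, using the airport coordinates listed below in decimal degrees
print(f"{haversine_km(26.731667, -11.684444, -1.41, 5.621667):.1f} km")  # ≈ 3637.9 km
```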

How long does it take to fly from Smara to San Antonio de Palé?

The estimated flight time from Smara Airport to Annobón Airport is 4 hours and 45 minutes.
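The site does not publish its formula. A common rule of thumb, assumed here, divides the distance by a typical jet cruise speed of about 500 mph and adds roughly 30 minutes for taxi, climb, and descent:

```python
def estimate_flight_time_hours(distance_miles: float,
                               cruise_mph: float = 500.0,    # assumed cruise speed
                               overhead_hours: float = 0.5   # assumed taxi/climb/descent
                               ) -> float:
    """Rough block time in hours for a given great-circle distance."""
    return distance_miles / cruise_mph + overhead_hours

hours = estimate_flight_time_hours(2253)
h, m = int(hours), round(hours % 1 * 60)
print(f"{h} h {m} min")  # ≈ 5 h 0 min with these assumed parameters
```

With these parameters the estimate lands near, but not exactly on, the 4 hours 45 minutes quoted above, so the site evidently uses slightly different speed or overhead values.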

What is the time difference between Smara and San Antonio de Palé?

There is no time difference between Smara and San Antonio de Palé.

Flight carbon footprint between Smara Airport (SMW) and Annobón Airport (NBN)

On average, flying from Smara to San Antonio de Palé generates about 247 kg (roughly 545 lb) of CO2 per passenger. These figures are estimates and account only for the CO2 generated by burning jet fuel.

Map of flight path from Smara to San Antonio de Palé

See the map of the shortest flight path between Smara Airport (SMW) and Annobón Airport (NBN).

Airport information

Origin: Smara Airport
City: Smara
Country: Western Sahara
IATA Code: SMW
ICAO Code: GMMA
Coordinates: 26°43′54″N, 11°41′4″W
Destination: Annobón Airport
City: San Antonio de Palé
Country: Equatorial Guinea
IATA Code: NBN
ICAO Code: FGAB
Coordinates: 1°24′36″S, 5°37′18″E
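The coordinates above are given in degrees, minutes, and seconds, while the distance formulas expect signed decimal degrees (negative for south and west). A small helper for the conversion:

```python
def dms_to_decimal(degrees: int, minutes: int, seconds: float, hemisphere: str) -> float:
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60 + seconds / 3600)

smw = (dms_to_decimal(26, 43, 54, "N"), dms_to_decimal(11, 41, 4, "W"))
nbn = (dms_to_decimal(1, 24, 36, "S"), dms_to_decimal(5, 37, 18, "E"))
print(smw)  # ≈ (26.7317, -11.6844)
print(nbn)  # ≈ (-1.41, 5.6217)
```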