
How far is Palmar Sur from Barrancabermeja?

The distance between Barrancabermeja (Yariguíes Airport) and Palmar Sur (Palmar Sur Airport) is 675 miles / 1086 kilometers / 587 nautical miles.

Yariguíes Airport – Palmar Sur Airport

675 miles / 1086 kilometers / 587 nautical miles


Distance from Barrancabermeja to Palmar Sur

There are several ways to calculate the distance from Barrancabermeja to Palmar Sur. Here are two standard methods:

Vincenty's formula (applied above)
  • 674.939 miles
  • 1086.209 kilometers
  • 586.506 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
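As a sketch of how such a calculation works, here is a standard implementation of Vincenty's inverse formula. The WGS-84 ellipsoid parameters below are an assumption (the site does not state which ellipsoid it uses, but WGS-84 is the usual choice):

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    # WGS-84 ellipsoid parameters (assumed; the site's model is unstated)
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    lam = L

    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    # Iterate on the difference in longitude on the auxiliary sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344  # statute miles

# EJA (7°1′27″N, 73°48′24″W) to PMZ (8°57′3″N, 83°28′6″W), in decimal degrees
miles = vincenty_miles(7.024167, -73.806667, 8.950833, -83.468333)
print(round(miles, 3))  # should land close to the 674.939 miles quoted above
```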

Haversine formula
  • 674.313 miles
  • 1085.201 kilometers
  • 585.962 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
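The great-circle computation is short enough to sketch directly. The mean Earth radius of 6371 km (about 3958.8 miles) is the usual assumption:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    R = 3958.8  # mean Earth radius in miles (assumed; 6371 km is the common value)
    p1, p2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * R * asin(sqrt(a))

# EJA: 7°1′27″N, 73°48′24″W  →  (7.024167, -73.806667)
# PMZ: 8°57′3″N, 83°28′6″W   →  (8.950833, -83.468333)
print(round(haversine_miles(7.024167, -73.806667, 8.950833, -83.468333), 1))
# ≈ 674.3 miles, in line with the haversine figure above
```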

How long does it take to fly from Barrancabermeja to Palmar Sur?

The estimated flight time from Yariguíes Airport to Palmar Sur Airport is 1 hour and 46 minutes.
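The page does not state its timing model. A common rule of thumb (an assumption here, not the site's actual formula) adds a fixed taxi/climb/descent overhead of about 30 minutes to cruise time at roughly 500 mph, which lands in the same ballpark as the quoted 1 hour 46 minutes:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Rule-of-thumb estimate: fixed taxi/takeoff/landing overhead
    # plus cruise time at an assumed average ground speed.
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(675)
print(f"{int(minutes // 60)} h {int(minutes % 60)} min")  # 1 h 51 min with these assumed parameters
```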

Flight carbon footprint between Yariguíes Airport (EJA) and Palmar Sur Airport (PMZ)

On average, flying from Barrancabermeja to Palmar Sur generates about 122 kg of CO2 per passenger; 122 kilograms is roughly 268 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
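The unit conversion and the implied per-mile emission rate can be checked directly (the emission factor below is derived from the figures above, not from the site's actual model):

```python
KG_PER_LB = 0.45359237  # exact definition of the international pound

co2_kg = 122
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))   # 269 for exactly 122 kg; the page's 268 lbs suggests
                       # its unrounded kg value is slightly below 122

per_mile = co2_kg / 675  # implied emission factor for this route
print(round(per_mile, 3))  # ≈ 0.181 kg CO2 per passenger-mile
```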

Map of flight path from Barrancabermeja to Palmar Sur

See the map of the shortest flight path between Yariguíes Airport (EJA) and Palmar Sur Airport (PMZ).

Airport information

Origin Yariguíes Airport
City: Barrancabermeja
Country: Colombia
IATA Code: EJA
ICAO Code: SKEJ
Coordinates: 7°1′27″N, 73°48′24″W
Destination Palmar Sur Airport
City: Palmar Sur
Country: Costa Rica
IATA Code: PMZ
ICAO Code: MRPM
Coordinates: 8°57′3″N, 83°28′6″W
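The coordinates above are given in degrees-minutes-seconds; the distance formulas need decimal degrees. The conversion, with the usual sign convention that south and west are negative:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # South and West hemispheres map to negative decimal degrees.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Yariguíes Airport (EJA): 7°1′27″N, 73°48′24″W
print(dms_to_decimal(7, 1, 27, "N"), dms_to_decimal(73, 48, 24, "W"))
# Palmar Sur Airport (PMZ): 8°57′3″N, 83°28′6″W
print(dms_to_decimal(8, 57, 3, "N"), dms_to_decimal(83, 28, 6, "W"))
```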