Air Miles Calculator

How far is Barrancabermeja from Palmas?

The distance between Palmas (Palmas Airport) and Barrancabermeja (Yariguíes Airport) is 2119 miles / 3410 kilometers / 1841 nautical miles.

The driving distance from Palmas (PMW) to Barrancabermeja (EJA) is 5129 miles / 8254 kilometers, and travel time by car is about 117 hours 27 minutes.

Palmas Airport – Yariguíes Airport

  • 2119 miles
  • 3410 kilometers
  • 1841 nautical miles


Distance from Palmas to Barrancabermeja

There are several ways to calculate the distance from Palmas to Barrancabermeja. Here are two standard methods:

Vincenty's formula (applied above)
  • 2118.610 miles
  • 3409.573 kilometers
  • 1841.022 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
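The iterative inverse method can be sketched in Python. This is a minimal implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, not the calculator's actual code; the airport coordinates come from the airport information section of this page, converted to decimal degrees.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: geodesic distance in km on the WGS-84 ellipsoid."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# PMW (10°17′29″S, 48°21′25″W) and EJA (7°1′27″N, 73°48′24″W) in decimal degrees
print(vincenty_km(-10.29139, -48.35694, 7.02417, -73.80667))  # ≈ 3409.6 km
```

The result agrees with the 3409.573 km figure above to within rounding of the input coordinates.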

Haversine formula
  • 2120.654 miles
  • 3412.862 kilometers
  • 1842.798 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
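The haversine computation is short enough to show in full. This sketch assumes a mean Earth radius of 6371 km; the same decimal-degree coordinates as above are used.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km, assuming a spherical Earth of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# PMW to EJA
print(haversine_km(-10.29139, -48.35694, 7.02417, -73.80667))  # ≈ 3412.9 km
```

The spherical result differs from the ellipsoidal Vincenty figure by only a few kilometers over this route.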

How long does it take to fly from Palmas to Barrancabermeja?

The estimated flight time from Palmas Airport to Yariguíes Airport is 4 hours and 30 minutes.
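A rough block-time estimate can be computed from distance alone. The cruise speed and fixed overhead below are illustrative assumptions, not the calculator's actual model, so the result only approximates the figure above.

```python
def flight_time_hours(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Crude block-time estimate: fixed taxi/climb/descent overhead plus cruise time.
    cruise_mph and overhead_hours are assumed values for illustration."""
    return overhead_hours + distance_miles / cruise_mph

t = flight_time_hours(2119)
print(f"{int(t)} h {round(t % 1 * 60)} min")
```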

Flight carbon footprint between Palmas Airport (PMW) and Yariguíes Airport (EJA)

On average, flying from Palmas to Barrancabermeja generates about 231 kg (509 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
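The unit conversion behind the pound figure is straightforward:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 231
print(round(co2_kg * KG_TO_LB))  # 509
```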

Map of flight path and driving directions from Palmas to Barrancabermeja

See the map of the shortest flight path between Palmas Airport (PMW) and Yariguíes Airport (EJA).

Airport information

Origin Palmas Airport
City: Palmas
Country: Brazil
IATA Code: PMW
ICAO Code: SBPJ
Coordinates: 10°17′29″S, 48°21′25″W
Destination Yariguíes Airport
City: Barrancabermeja
Country: Colombia
IATA Code: EJA
ICAO Code: SKEJ
Coordinates: 7°1′27″N, 73°48′24″W
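The coordinates above are given in degrees/minutes/seconds; the distance formulas need signed decimal degrees. A minimal conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert D°M′S″ plus a hemisphere letter (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Palmas Airport: 10°17′29″S, 48°21′25″W
print(dms_to_decimal(10, 17, 29, "S"))  # ≈ -10.2914
print(dms_to_decimal(48, 21, 25, "W"))  # ≈ -48.3569
```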