How far is Barrancabermeja from Itaituba?

The distance between Itaituba (Itaituba Airport) and Barrancabermeja (Yariguíes Airport) is 1453 miles / 2338 kilometers / 1262 nautical miles.

The driving distance from Itaituba (ITB) to Barrancabermeja (EJA) is 3102 miles / 4992 kilometers, and travel time by car is about 98 hours 25 minutes.

Itaituba Airport – Yariguíes Airport: 1453 miles / 2338 kilometers / 1262 nautical miles

Distance from Itaituba to Barrancabermeja

There are several ways to calculate the distance from Itaituba to Barrancabermeja. Here are two standard methods:

Vincenty's formula (applied above)
  • 1452.758 miles
  • 2337.988 kilometers
  • 1262.413 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
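As a quick sanity check, an ellipsoidal distance of this kind can be reproduced with the geopy library, which uses Karney's geodesic algorithm on the WGS-84 ellipsoid rather than Vincenty's formula itself; the result should land within a few hundred meters of the figures above. The coordinates are the airport coordinates listed further down the page. A minimal sketch:

    # Cross-check of the ellipsoidal distance with geopy (Karney's geodesic
    # algorithm on WGS-84, not Vincenty proper, but the results are very close).
    from geopy.distance import geodesic

    itb = (-4.242222, -56.000556)   # Itaituba Airport, 4°14′32″S 56°0′2″W
    eja = (7.024167, -73.806667)    # Yariguíes Airport, 7°1′27″N 73°48′24″W

    d = geodesic(itb, eja)
    print(f"{d.miles:.3f} miles / {d.kilometers:.3f} km / {d.nautical:.3f} NM")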

Haversine formula
  • 1453.889 miles
  • 2339.808 kilometers
  • 1263.395 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
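For illustration, here is a minimal haversine sketch in Python, assuming a mean earth radius of 6371 km; with the airport coordinates listed below, it reproduces the roughly 2340 km figure above to within about a kilometer.

    # Great-circle (haversine) distance on a spherical earth, mean radius 6371 km.
    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
        """Great-circle distance between two lat/lon points, in kilometers."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlmb = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
        return 2 * r_km * asin(sqrt(a))

    km = haversine_km(-4.242222, -56.000556, 7.024167, -73.806667)
    print(f"{km:.1f} km / {km * 0.621371:.1f} miles / {km / 1.852:.1f} NM")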

How long does it take to fly from Itaituba to Barrancabermeja?

The estimated flight time from Itaituba Airport to Yariguíes Airport is 3 hours and 15 minutes.
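The calculator's exact flight-time formula is not published; estimates like this are typically built from a simple block-time model, i.e. the great-circle distance divided by an average cruise ground speed, plus a fixed allowance for taxi, climb, and descent. The speed and allowance below are illustrative assumptions rather than the site's actual parameters, and they land in the same ballpark as the 3 hours 15 minutes figure.

    # Hypothetical block-time model: cruise speed and overhead are assumptions,
    # not the calculator's published parameters.
    def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
        """Return (hours, minutes) for a rough block-time estimate."""
        total_min = distance_miles / cruise_mph * 60.0 + overhead_min
        return int(total_min // 60), int(round(total_min % 60))

    h, m = estimate_flight_time(1452.758)
    print(f"about {h} h {m} min")   # ~3 h 24 min with these assumptions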

Flight carbon footprint between Itaituba Airport (ITB) and Yariguíes Airport (EJA)

On average, flying from Itaituba to Barrancabermeja generates about 177 kg of CO2 per passenger (roughly 389 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
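For reference, the pound figure is a straight unit conversion, and per-passenger CO2 estimates of this kind are usually derived from an assumed fuel burn multiplied by an emission factor of roughly 3.16 kg of CO2 per kg of jet fuel. The emission factor and the implied fuel burn below are illustrative assumptions, not the calculator's published inputs.

    # Unit conversion plus a back-of-envelope breakdown; the 3.16 kg CO2 per kg
    # of jet fuel emission factor is an assumption, not the site's stated input.
    CO2_KG = 177.0
    LBS_PER_KG = 2.20462
    CO2_PER_KG_FUEL = 3.16

    print(f"{CO2_KG * LBS_PER_KG:.0f} lbs")   # ~390 lbs from the rounded 177 kg
    fuel_kg = CO2_KG / CO2_PER_KG_FUEL
    print(f"implied fuel burn: about {fuel_kg:.0f} kg per passenger")   # ~56 kg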

Map of flight path and driving directions from Itaituba to Barrancabermeja

See the map of the shortest flight path between Itaituba Airport (ITB) and Yariguíes Airport (EJA).

Airport information

Origin: Itaituba Airport
City: Itaituba
Country: Brazil
IATA Code: ITB
ICAO Code: SBIH
Coordinates: 4°14′32″S, 56°0′2″W
Destination: Yariguíes Airport
City: Barrancabermeja
Country: Colombia
IATA Code: EJA
ICAO Code: SKEJ
Coordinates: 7°1′27″N, 73°48′24″W