How far is Burlington, IA, from Sarajevo?

The distance between Sarajevo (Sarajevo International Airport) and Burlington (Southeast Iowa Regional Airport) is 5148 miles / 8285 kilometers / 4473 nautical miles.

Distance from Sarajevo to Burlington

There are several ways to calculate the distance from Sarajevo to Burlington. Here are two standard methods:

Vincenty's formula (applied above)
  • 5147.953 miles
  • 8284.827 kilometers
  • 4473.449 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
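
As a worked example, here is a minimal Python sketch of Vincenty's inverse formula, assuming the standard WGS-84 ellipsoid (the calculator does not say which ellipsoid it uses); the vincenty_distance helper is illustrative, and the decimal coordinates are converted from the DMS values in the airport information below.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Geodesic distance in kilometers on the WGS-84 ellipsoid."""
        a = 6378137.0              # semi-major axis in meters
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(phi2))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)     # equatorial-line guard
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                              * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1000.0  # meters -> kilometers

    # SJJ (43°49'28"N, 18°19'53"E) to BRL (40°46'59"N, 91°7'31"W)
    print(round(vincenty_distance(43.8244, 18.3314, 40.7831, -91.1253)))  # ~8285 km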

Haversine formula
  • 5134.699 miles
  • 8263.497 kilometers
  • 4461.932 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
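
For comparison, here is a minimal sketch of the haversine formula, assuming a mean earth radius of 6371 km (the calculator's exact radius is not stated, so the last digits may differ slightly); haversine_km is an illustrative helper.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in kilometers on a spherical earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        d_phi = math.radians(lat2 - lat1)
        d_lam = math.radians(lon2 - lon1)
        h = (math.sin(d_phi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    # Same SJJ and BRL coordinates as above
    print(round(haversine_km(43.8244, 18.3314, 40.7831, -91.1253)))  # ~8263 km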

How long does it take to fly from Sarajevo to Burlington?

The estimated flight time from Sarajevo International Airport to Southeast Iowa Regional Airport is 10 hours and 14 minutes.
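
The page does not disclose the model behind this estimate. One simple approach is to divide the distance by an assumed average block speed; in the illustrative sketch below, an average of about 810 km/h happens to reproduce the figure above, but that speed is back-calculated from the page's numbers, not the site's documented method.

    def flight_time(distance_km, avg_speed_kmh=810.0):
        """Rough block time from distance and an assumed average speed."""
        hours, minutes = divmod(round(distance_km / avg_speed_kmh * 60), 60)
        return f"{hours} hours and {minutes} minutes"

    print(flight_time(8285))  # "10 hours and 14 minutes"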

Flight carbon footprint between Sarajevo International Airport (SJJ) and Southeast Iowa Regional Airport (BRL)

On average, flying from Sarajevo to Burlington generates about 603 kg of CO2 per passenger, which is roughly 1,330 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
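
The implied emission factor is roughly 603 kg ÷ 8285 km ≈ 0.073 kg of CO2 per passenger-kilometer. The sketch below shows that arithmetic and the kilogram-to-pound conversion; the factor is back-calculated from the page's own numbers rather than taken from a published methodology, and co2_per_passenger is an illustrative helper.

    KG_PER_LB = 0.45359237  # exact international pound

    def co2_per_passenger(distance_km, factor_kg_per_km=0.0728):
        """Estimated CO2 for one passenger, as (kilograms, pounds)."""
        kg = distance_km * factor_kg_per_km
        return kg, kg / KG_PER_LB

    kg, lbs = co2_per_passenger(8285)
    print(f"{kg:.0f} kg = {lbs:.0f} lbs")  # "603 kg = 1330 lbs"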

Map of flight path from Sarajevo to Burlington

See the map of the shortest flight path between Sarajevo International Airport (SJJ) and Southeast Iowa Regional Airport (BRL).

Airport information

Origin: Sarajevo International Airport
City: Sarajevo
Country: Bosnia and Herzegovina
IATA Code: SJJ
ICAO Code: LQSA
Coordinates: 43°49′28″N, 18°19′53″E
Destination: Southeast Iowa Regional Airport
City: Burlington, IA
Country: United States
IATA Code: BRL
ICAO Code: KBRL
Coordinates: 40°46′59″N, 91°7′31″W