
How far is Burlington, IA, from East London?

The distance between East London (East London Airport) and Burlington (Southeast Iowa Regional Airport) is 9087 miles / 14624 kilometers / 7896 nautical miles.

East London Airport – Southeast Iowa Regional Airport

Distance: 9087 miles / 14624 kilometers / 7896 nautical miles
Flight time: 17 h 42 min
CO2 emission: 1 160 kg


Distance from East London to Burlington

There are several ways to calculate the distance from East London to Burlington. Here are two standard methods:

Vincenty's formula (applied above)
  • 9086.938 miles
  • 14624.008 kilometers
  • 7896.333 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
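The ellipsoidal calculation above can be sketched with the standard inverse Vincenty iteration. This is an illustrative implementation on the WGS-84 ellipsoid; the site's exact parameters are not published, so small differences in the last decimals are possible.

```python
import math

# WGS-84 ellipsoid constants (assumed; the site may use a slightly different datum)
A_AXIS = 6378137.0                 # semi-major axis in metres
F = 1 / 298.257223563              # flattening
B_AXIS = (1 - F) * A_AXIS          # semi-minor axis in metres

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: geodesic distance in metres between two lat/lon points (degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        if cos2_alpha == 0:                     # both points on the equator
            cos_2sigma_m = 0.0
        else:
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma)

# ELS (33°2′8″S, 27°49′33″E) to BRL (40°46′59″N, 91°7′31″W)
els = (-(33 + 2/60 + 8/3600), 27 + 49/60 + 33/3600)
brl = (40 + 46/60 + 59/3600, -(91 + 7/60 + 31/3600))
km = vincenty_distance(*els, *brl) / 1000   # close to the 14624 km quoted above
```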

Haversine formula
  • 9089.757 miles
  • 14628.545 kilometers
  • 7898.783 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
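The spherical calculation is much simpler. A minimal sketch, assuming a mean earth radius of 6371 km (the site's exact radius is not published, so results may differ by a few kilometers):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean earth radius (assumed value)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points given in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(h))

els = (-(33 + 2/60 + 8/3600), 27 + 49/60 + 33/3600)   # East London (ELS)
brl = (40 + 46/60 + 59/3600, -(91 + 7/60 + 31/3600))  # Burlington (BRL)
km = haversine_km(*els, *brl)       # close to the 14628.5 km quoted above
miles = km / 1.609344               # statute miles per kilometer (exact factor)
```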

How long does it take to fly from East London to Burlington?

The estimated flight time from East London Airport to Southeast Iowa Regional Airport is 17 hours and 42 minutes.
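Estimates like this typically combine a cruise leg with a fixed overhead for taxi, climb, and descent. A rough sketch of that rule of thumb; the 530 mph average speed and 30-minute overhead are assumptions, not the site's published parameters:

```python
def estimate_flight_time(distance_miles, cruise_mph=530, overhead_min=30):
    """Rule-of-thumb flight time: cruise leg plus fixed taxi/climb/descent overhead.
    cruise_mph and overhead_min are illustrative assumptions."""
    hours = distance_miles / cruise_mph + overhead_min / 60
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

h, m = estimate_flight_time(9087)   # roughly 17 h 40 min with these assumptions
```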

Flight carbon footprint between East London Airport (ELS) and Southeast Iowa Regional Airport (BRL)

On average, flying from East London to Burlington generates about 1 160 kg of CO2 per passenger, and 1 160 kilograms is about 2 557 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
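A quick check of the kilograms-to-pounds conversion, using the exact definition of the avoirdupois pound (small discrepancies against a quoted pounds figure usually come from rounding the kilogram value first):

```python
KG_PER_POUND = 0.45359237   # exact by definition

co2_kg = 1160
co2_lbs = co2_kg / KG_PER_POUND   # about 2 557 lbs
```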

Map of flight path from East London to Burlington

See the map of the shortest flight path between East London Airport (ELS) and Southeast Iowa Regional Airport (BRL).

Airport information

Origin: East London Airport
City: East London
Country: South Africa
IATA Code: ELS
ICAO Code: FAEL
Coordinates: 33°2′8″S, 27°49′33″E

Destination: Southeast Iowa Regional Airport
City: Burlington, IA
Country: United States
IATA Code: BRL
ICAO Code: KBRL
Coordinates: 40°46′59″N, 91°7′31″W