How far is Burlington, IA, from Johannesburg?
The distance between Johannesburg (Lanseria International Airport) and Burlington (Southeast Iowa Regional Airport) is 8846 miles / 14237 kilometers / 7687 nautical miles.
Lanseria International Airport – Southeast Iowa Regional Airport
Distance from Johannesburg to Burlington
There are several ways to calculate the distance from Johannesburg to Burlington. Here are two standard methods:
Vincenty's formula (applied above)
- 8846.224 miles
- 14236.617 kilometers
- 7687.158 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
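As a rough illustration (not the site's own implementation), here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates from the tables below; the convergence tolerance and iteration cap are arbitrary choices, not values the site documents.

```python
import math

# WGS-84 ellipsoid constants
A = 6378137.0            # semi-major axis, meters
F = 1 / 298.257223563    # flattening
B = (1 - F) * A          # semi-minor axis, meters

def vincenty_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters between two points on the WGS-84 ellipsoid."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # initial guess: longitude difference on the auxiliary sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    k1 = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    k2 = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = k2 * sin_sigma * (cos_2sm + k2 / 4 * (
        cos_sigma * (2 * cos_2sm ** 2 - 1)
        - k2 / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return B * k1 * (sigma - d_sigma)

# HLA (25°56′18″S, 27°55′33″E) to BRL (40°46′59″N, 91°7′31″W)
meters = vincenty_m(-25.938333, 27.925833, 40.783056, -91.125278)
print(f"{meters / 1609.344:.3f} mi")  # ≈ 8846 miles, matching the figure above
```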
Haversine formula
- 8847.475 miles
- 14238.631 kilometers
- 7688.246 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
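Similarly, a short Python sketch of the haversine formula. The mean Earth radius below (6,371 km) is a common convention; the exact radius the site uses is not stated, so the last digits may differ slightly from the figure above.

```python
import math

R_EARTH_KM = 6371.0  # conventional mean Earth radius (assumption)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R_EARTH_KM * math.asin(math.sqrt(a))

# HLA (25°56′18″S, 27°55′33″E) to BRL (40°46′59″N, 91°7′31″W)
print(f"{haversine_km(-25.938333, 27.925833, 40.783056, -91.125278):.1f} km")
# ≈ 14,239 km, close to the 14238.631 km figure above
```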
How long does it take to fly from Johannesburg to Burlington?
The estimated flight time from Lanseria International Airport to Southeast Iowa Regional Airport is 17 hours and 14 minutes.
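The site does not state how it derives this estimate; a plausible reconstruction is simply the great-circle distance divided by an assumed average speed. The speed below is a hypothetical value chosen only to reproduce the 17 h 14 min figure, not one the site documents.

```python
distance_mi = 8846.224   # Vincenty distance from above
avg_speed_mph = 513.4    # hypothetical average speed (assumption)
hours = distance_mi / avg_speed_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ≈ 17 h 14 min
```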
What is the time difference between Johannesburg and Burlington?
Johannesburg observes South African Standard Time (UTC+2) year-round, while Burlington is on US Central Time (UTC−6, or UTC−5 during daylight saving time). Johannesburg is therefore 8 hours ahead of Burlington in winter and 7 hours ahead during US daylight saving time.
Flight carbon footprint between Lanseria International Airport (HLA) and Southeast Iowa Regional Airport (BRL)
On average, flying from Johannesburg to Burlington generates about 1,124 kg of CO2 per passenger, which is equivalent to 2,478 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
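The kilograms-to-pounds equivalence can be checked directly with the standard conversion factor of 2.20462 lb per kg:

```python
co2_kg = 1124
co2_lb = co2_kg * 2.20462  # kilograms to pounds
print(round(co2_lb))       # ≈ 2478 lbs, matching the figure above
```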
Map of flight path from Johannesburg to Burlington
See the map of the shortest flight path between Lanseria International Airport (HLA) and Southeast Iowa Regional Airport (BRL).
Airport information
| Origin | Lanseria International Airport |
| --- | --- |
| City: | Johannesburg |
| Country: | South Africa |
| IATA Code: | HLA |
| ICAO Code: | FALA |
| Coordinates: | 25°56′18″S, 27°55′33″E |
| Destination | Southeast Iowa Regional Airport |
| --- | --- |
| City: | Burlington, IA |
| Country: | United States |
| IATA Code: | BRL |
| ICAO Code: | KBRL |
| Coordinates: | 40°46′59″N, 91°7′31″W |