
How far is Bengkulu from Johannesburg?

The distance between Johannesburg (Lanseria International Airport) and Bengkulu (Fatmawati Soekarno Airport) is 5138 miles / 8269 kilometers / 4465 nautical miles.

Lanseria International Airport – Fatmawati Soekarno Airport
  • 5138 miles
  • 8269 kilometers
  • 4465 nautical miles


Distance from Johannesburg to Bengkulu

There are several ways to calculate the distance from Johannesburg to Bengkulu. Here are two standard methods:

Vincenty's formula (applied above)
  • 5138.013 miles
  • 8268.830 kilometers
  • 4464.811 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
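As a rough illustration, the sketch below implements the standard Vincenty inverse solution on the WGS-84 ellipsoid in Python. The iteration tolerance and the decimal airport coordinates (converted from the degrees/minutes/seconds listed under Airport information) are assumptions for illustration, not the calculator's actual code.

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break                 # converged

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# HLA and BKS in decimal degrees (assumed conversion of the listed coordinates)
print(vincenty_distance_km(-25.9383, 27.9258, -3.8636, 102.3389))  # ≈ 8269 km
```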

Haversine formula
  • 5133.809 miles
  • 8262.065 kilometers
  • 4461.158 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth. This gives the great-circle distance, the shortest path between two points along the sphere's surface.
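For comparison, here is a minimal haversine sketch in Python, assuming a mean Earth radius of 6371 km and the same decimal coordinates as above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# HLA → BKS
print(haversine_km(-25.9383, 27.9258, -3.8636, 102.3389))  # ≈ 8262 km
```

The small gap between the two results (about 7 km here) comes from the spherical versus ellipsoidal model of the Earth, not from the route itself.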

How long does it take to fly from Johannesburg to Bengkulu?

The estimated flight time from Lanseria International Airport to Fatmawati Soekarno Airport is 10 hours and 13 minutes.
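A back-of-the-envelope way to reproduce this kind of estimate is distance divided by an assumed average block speed. The speed below is an assumption chosen for illustration, not the calculator's published method; with this assumption the result happens to land near the figure above.

```python
# Rough flight-time estimate: distance / assumed average speed.
distance_miles = 5138
avg_speed_mph = 503            # assumed gate-to-gate average speed (illustrative)

hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m:02d} min")    # ≈ 10 h 13 min under these assumptions
```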

Flight carbon footprint between Lanseria International Airport (HLA) and Fatmawati Soekarno Airport (BKS)

On average, flying from Johannesburg to Bengkulu generates about 602 kg of CO2 per passenger, which equals roughly 1,327 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
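As a sketch, a per-passenger estimate of this kind can be approximated as distance times an average emission factor. The factor below is an assumption picked so the result lands in the same ballpark as the figure above; it is not the calculator's published methodology.

```python
# Rough per-passenger CO2 estimate: distance x assumed emission factor.
distance_km = 8269
co2_per_km_kg = 0.073          # assumed kg CO2 per passenger-km, fuel burn only (illustrative)

co2_kg = distance_km * co2_per_km_kg
co2_lbs = co2_kg * 2.20462     # kilograms to pounds
print(f"{co2_kg:.0f} kg CO2 per passenger (~{co2_lbs:.0f} lbs)")
```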

Map of flight path from Johannesburg to Bengkulu

See the map of the shortest flight path between Lanseria International Airport (HLA) and Fatmawati Soekarno Airport (BKS).

Airport information

Origin Lanseria International Airport
City: Johannesburg
Country: South Africa
IATA Code: HLA
ICAO Code: FALA
Coordinates: 25°56′18″S, 27°55′33″E
Destination Fatmawati Soekarno Airport
City: Bengkulu
Country: Indonesia
IATA Code: BKS
ICAO Code: WIPL
Coordinates: 3°51′49″S, 102°20′20″E
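
The coordinates above are given in degrees, minutes, and seconds. A small helper like the hypothetical one below converts them to the signed decimal degrees that the distance formulas earlier in this page expect:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Lanseria International Airport (HLA): 25°56′18″S, 27°55′33″E
print(dms_to_decimal(25, 56, 18, "S"), dms_to_decimal(27, 55, 33, "E"))    # -25.9383, 27.9258

# Fatmawati Soekarno Airport (BKS): 3°51′49″S, 102°20′20″E
print(dms_to_decimal(3, 51, 49, "S"), dms_to_decimal(102, 20, 20, "E"))    # -3.8636, 102.3389
```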