
How far is Lightning Ridge from Johannesburg?

The distance between Johannesburg (Lanseria International Airport) and Lightning Ridge (Lightning Ridge Airport) is 6935 miles / 11161 kilometers / 6026 nautical miles.


Distance from Johannesburg to Lightning Ridge

There are several ways to calculate the distance from Johannesburg to Lightning Ridge. Here are two standard methods:

Vincenty's formula (applied above)
  • 6935.079 miles
  • 11160.928 kilometers
  • 6026.419 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
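
For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, the convergence tolerance, and the iteration cap are choices made here, not taken from this site, and the iteration can fail to converge for nearly antipodal points.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        """Ellipsoidal distance (Vincenty inverse method) in miles on WGS-84."""
        a = 6378137.0            # semi-major axis, metres
        f = 1 / 298.257223563    # flattening
        b = (1 - f) * a          # semi-minor axis, metres

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):                      # iterate until lambda converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0.0:
                return 0.0                        # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1.0 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)    # guard for equatorial lines
            C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
            lam_prev = lam
            lam = L + (1.0 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a * a - b * b) / (b * b)
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4.0 * (
            cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)
            - B / 6.0 * cos_2sm * (-3.0 + 4.0 * sin_sigma ** 2)
                * (-3.0 + 4.0 * cos_2sm ** 2)))
        metres = b * A * (sigma - d_sigma)
        return metres / 1609.344                  # metres -> statute miles

    # HLA -> LHG, using the decimal coordinates derived from the airport data below;
    # this should land near the ~6935 miles quoted above.
    print(vincenty_miles(-25.9383, 27.9258, -29.4567, 147.9839))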

Haversine formula
  • 6923.185 miles
  • 11141.785 kilometers
  • 6016.083 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance – the shortest path between two points along the surface.
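
A sphere makes the computation much shorter. Here is a minimal haversine sketch, assuming the commonly used mean Earth radius of 6371 km (the radius this site uses is not stated):

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in miles, assuming a spherical earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h)) / 1.609344

    # Should land near the ~6923 miles quoted above.
    print(haversine_miles(-25.9383, 27.9258, -29.4567, 147.9839))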

How long does it take to fly from Johannesburg to Lightning Ridge?

The estimated flight time from Lanseria International Airport to Lightning Ridge Airport is 13 hours and 37 minutes.
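
The page does not publish its timing model. A common back-of-the-envelope estimate is a fixed taxi/climb/descent allowance plus cruise time at a typical jet speed; the 530 mph cruise and 30-minute overhead below are assumptions chosen to land near the figure above, not the site's actual parameters.

    def flight_time(distance_miles, cruise_mph=530, overhead_min=30):
        """Rough flight time: fixed ground/climb overhead plus cruise time.
        Both parameters are assumptions, not this site's published model."""
        total_min = overhead_min + distance_miles / cruise_mph * 60
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} hours and {minutes} minutes"

    print(flight_time(6935))  # "13 hours and 35 minutes", close to the estimate above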

Flight carbon footprint between Lanseria International Airport (HLA) and Lightning Ridge Airport (LHG)

On average, flying from Johannesburg to Lightning Ridge generates about 846 kg (1,864 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
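
As a sanity check on the unit conversion, here is a small sketch; the per-mile emission factor is a hypothetical value back-solved from the numbers above (846 kg over 6935 miles), not a published coefficient.

    KG_PER_LB = 0.45359237

    def co2_kg(distance_miles, kg_per_passenger_mile=0.122):
        """Per-passenger CO2 from fuel burn; the factor is hypothetical,
        back-solved from the figures above. Real factors vary by aircraft,
        load factor, and routing."""
        return distance_miles * kg_per_passenger_mile

    kg = co2_kg(6935)
    print(f"{kg:.0f} kg ~ {kg / KG_PER_LB:.0f} lb")  # ~846 kg ~ 1,865 lb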

Map of flight path from Johannesburg to Lightning Ridge

See the map of the shortest flight path between Lanseria International Airport (HLA) and Lightning Ridge Airport (LHG).

Airport information

Origin: Lanseria International Airport
City: Johannesburg
Country: South Africa
IATA Code: HLA
ICAO Code: FALA
Coordinates: 25°56′18″S, 27°55′33″E
Destination: Lightning Ridge Airport
City: Lightning Ridge
Country: Australia
IATA Code: LHG
ICAO Code: YLRD
Coordinates: 29°27′24″S, 147°59′2″E
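
The coordinates above are given in degrees/minutes/seconds, while the distance formulas earlier expect signed decimal degrees. A small helper for the conversion, using the usual convention that south and west are negative:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter
        (N/S/E/W) to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # Coordinates from the airport information above
    hla = (dms_to_decimal(25, 56, 18, "S"), dms_to_decimal(27, 55, 33, "E"))
    lhg = (dms_to_decimal(29, 27, 24, "S"), dms_to_decimal(147, 59, 2, "E"))
    print(hla)  # approximately (-25.9383, 27.9258)
    print(lhg)  # approximately (-29.4567, 147.9839)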