
How far is Lightning Ridge from Barcelona?

The distance between Barcelona (Barcelona–El Prat Airport) and Lightning Ridge (Lightning Ridge Airport) is 10367 miles / 16684 kilometers / 9009 nautical miles.

Barcelona–El Prat Airport – Lightning Ridge Airport

Distance: 10367 miles / 16684 kilometers / 9009 nautical miles
Flight time: 20 h 7 min
CO2 emission: 1 360 kg

Distance from Barcelona to Lightning Ridge

There are several ways to calculate the distance from Barcelona to Lightning Ridge. Here are two standard methods:

Vincenty's formula (applied above)
  • 10367.141 miles
  • 16684.296 kilometers
  • 9008.799 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
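
As an illustrative sketch (not the calculator's own code), Vincenty's inverse formula can be implemented as below. The WGS-84 constants and the convergence tolerance are assumptions; the coordinates are taken from the airport information further down the page.

import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    # Vincenty inverse formula on the WGS-84 ellipsoid; coordinates in decimal degrees.
    a = 6378137.0                  # semi-major axis (metres)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):           # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344       # metres -> statute miles

# BCN -> LHG; this should land close to the 10367-mile figure quoted above.
print(round(vincenty_miles(41.29694, 2.07833, -29.45667, 147.98389), 3))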

Haversine formula
  • 10366.027 miles
  • 16682.503 kilometers
  • 9007.831 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
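
A minimal haversine sketch follows, assuming a mean Earth radius of 6371 km (the exact radius constant used by the calculator is not stated).

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere; coordinates in decimal degrees.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# BCN -> LHG; with R = 6371 km this should come out near the 16682-km haversine figure above.
print(round(haversine_km(41.29694, 2.07833, -29.45667, 147.98389), 1))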

How long does it take to fly from Barcelona to Lightning Ridge?

The estimated flight time from Barcelona–El Prat Airport to Lightning Ridge Airport is 20 hours and 7 minutes.
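
The site does not publish its flight-time model. A common approximation is great-circle distance divided by an assumed average cruise speed, plus a fixed allowance for taxi, climb, and descent; the constants below are assumptions chosen only to illustrate the idea.

def estimate_flight_time_hours(distance_km, cruise_kmh=840.0, overhead_hours=0.25):
    # Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.
    # Both constants are assumptions, not the calculator's published parameters.
    return distance_km / cruise_kmh + overhead_hours

hours = estimate_flight_time_hours(16684)
# With these assumed constants the estimate lands near the 20 h 7 min shown above.
print(f"{int(hours)} h {round((hours % 1) * 60)} min")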

Flight carbon footprint between Barcelona–El Prat Airport (BCN) and Lightning Ridge Airport (LHG)

On average, flying from Barcelona to Lightning Ridge generates about 1 360 kg of CO2 per passenger, and 1 360 kilograms equals 2 997 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
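
A simple way to reproduce a figure of this kind is to multiply the distance by a per-passenger emission factor and convert kilograms to pounds. The factor below is back-calculated from the numbers on this page and is only an assumption, not the site's published methodology.

def co2_per_passenger_kg(distance_km, kg_per_passenger_km=0.0815):
    # Per-passenger CO2 from an assumed long-haul emission factor (kg CO2 per passenger-km).
    return distance_km * kg_per_passenger_km

kg = co2_per_passenger_kg(16684)
# Roughly matches the per-passenger figure quoted above (1 kg = 2.20462 lbs).
print(round(kg), "kg =", round(kg * 2.20462), "lbs")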

Map of flight path from Barcelona to Lightning Ridge

See the map of the shortest flight path between Barcelona–El Prat Airport (BCN) and Lightning Ridge Airport (LHG).

Airport information

Origin: Barcelona–El Prat Airport
City: Barcelona
Country: Spain
IATA Code: BCN
ICAO Code: LEBL
Coordinates: 41°17′49″N, 2°4′42″E
Destination: Lightning Ridge Airport
City: Lightning Ridge
Country: Australia
IATA Code: LHG
ICAO Code: YLRD
Coordinates: 29°27′24″S, 147°59′2″E
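
The coordinates above are given in degrees, minutes, and seconds. A small helper (the name is illustrative) converts them to the decimal degrees used by the distance sketches earlier on this page.

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees.
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

bcn = (dms_to_decimal(41, 17, 49, "N"), dms_to_decimal(2, 4, 42, "E"))    # about (41.2969, 2.0783)
lhg = (dms_to_decimal(29, 27, 24, "S"), dms_to_decimal(147, 59, 2, "E"))  # about (-29.4567, 147.9839)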