
How far is Lightning Ridge from Southampton?

The distance between Southampton (Southampton Airport) and Lightning Ridge (Lightning Ridge Airport) is 10,267 miles / 16,523 kilometers / 8,921 nautical miles.

Southampton Airport – Lightning Ridge Airport

  • Distance: 10,267 miles / 16,523 kilometers / 8,921 nautical miles
  • Flight time: 19 h 56 min
  • CO2 emission: 1,344 kg

Distance from Southampton to Lightning Ridge

There are several ways to calculate the distance from Southampton to Lightning Ridge. Here are two standard methods:

Vincenty's formula (applied above)
  • 10,266.644 miles
  • 16,522.561 kilometers
  • 8,921.469 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
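As a concrete illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates come from the Airport information section below; the iteration limit and tolerance are arbitrary illustrative choices, not values used by this calculator.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Vincenty's inverse formula on the WGS-84 ellipsoid (statute miles)."""
        a = 6378137.0            # WGS-84 semi-major axis (metres)
        f = 1 / 298.257223563    # WGS-84 flattening
        b = (1 - f) * a          # semi-minor axis

        L = math.radians(lon2 - lon1)
        # Reduced latitudes on the auxiliary sphere
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L  # first approximation of the longitude difference
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos(2 * sigma_m); zero for lines along the equator
            cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break  # converged

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
        metres = b * A * (sigma - delta_sigma)
        return metres / 1609.344  # metres -> statute miles

    # SOU and LHG in decimal degrees (see the coordinate helper at the end)
    print(round(vincenty_miles(50.9503, -1.3567, -29.4567, 147.9839), 3))
    # ~10,266.6 miles, matching the figure above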

Haversine formula
  • 10,269.019 miles
  • 16,526.384 kilometers
  • 8,923.533 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
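For comparison, a haversine sketch, assuming the commonly used mean Earth radius of 6,371 km (a different radius shifts the result slightly); the small gap between the two methods' figures comes from the spherical versus ellipsoidal model:

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere, in statute miles."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        d_phi = math.radians(lat2 - lat1)
        d_lam = math.radians(lon2 - lon1)
        h = (math.sin(d_phi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
        km = 2 * radius_km * math.asin(math.sqrt(h))
        return km / 1.609344  # kilometres -> statute miles

    print(round(haversine_miles(50.9503, -1.3567, -29.4567, 147.9839), 3))
    # ~10,269 miles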

How long does it take to fly from Southampton to Lightning Ridge?

The estimated flight time from Southampton Airport to Lightning Ridge Airport is 19 hours and 56 minutes.
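The calculator's exact assumptions are not published, but estimates like this are typically derived from the distance and an assumed average speed. A minimal sketch, with a hypothetical 500 mph block speed and a 30-minute allowance for taxi, takeoff, and landing (both parameters are illustrative, not the site's actual model):

    def flight_time_hours(distance_miles, block_speed_mph=500.0, overhead_min=30.0):
        # Illustrative parameters only; the site's model is not stated.
        return distance_miles / block_speed_mph + overhead_min / 60.0

    hours = flight_time_hours(10267)
    print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # 21 h 2 min

With these assumed parameters the estimate lands near 21 hours, somewhat above the 19 h 56 min quoted above, which suggests the calculator assumes a higher average speed.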

Flight carbon footprint between Southampton Airport (SOU) and Lightning Ridge Airport (LHG)

On average, flying from Southampton to Lightning Ridge generates about 1,344 kg of CO2 per passenger; 1,344 kilograms equals 2,962 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
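A quick check of the unit conversion (the per-passenger emission figure itself is the calculator's estimate; only the kilograms-to-pounds arithmetic is shown here):

    KG_TO_LBS = 2.20462  # pounds per kilogram

    co2_kg = 1344
    print(f"{co2_kg * KG_TO_LBS:,.0f} lbs")
    # ~2,963 lbs; the page shows 2,962, likely converted from an unrounded kg value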

Map of flight path from Southampton to Lightning Ridge

See the map of the shortest flight path between Southampton Airport (SOU) and Lightning Ridge Airport (LHG).

Airport information

Origin: Southampton Airport
City: Southampton
Country: United Kingdom
IATA Code: SOU
ICAO Code: EGHI
Coordinates: 50°57′1″N, 1°21′24″W
Destination: Lightning Ridge Airport
City: Lightning Ridge
Country: Australia
IATA Code: LHG
ICAO Code: YLRD
Coordinates: 29°27′24″S, 147°59′2″E
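The distance sketches above take decimal degrees; here is a small helper for converting the degree/minute/second coordinates listed above (the function name and tuple layout are arbitrary illustrative choices):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
        sign = -1.0 if hemisphere in "SW" else 1.0
        return sign * (degrees + minutes / 60 + seconds / 3600)

    sou = (dms_to_decimal(50, 57, 1, "N"), dms_to_decimal(1, 21, 24, "W"))
    lhg = (dms_to_decimal(29, 27, 24, "S"), dms_to_decimal(147, 59, 2, "E"))
    print(sou, lhg)  # approx. (50.9503, -1.3567) and (-29.4567, 147.9839)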