
How far is Lightning Ridge from Shenzhen?

The distance between Shenzhen (Shenzhen Bao'an International Airport) and Lightning Ridge (Lightning Ridge Airport) is 4243 miles / 6829 kilometers / 3687 nautical miles.


Distance from Shenzhen to Lightning Ridge

There are several ways to calculate the distance from Shenzhen to Lightning Ridge. Here are two standard methods:

Vincenty's formula (applied above)
  • 4243.067 miles
  • 6828.555 kilometers
  • 3687.125 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
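The ellipsoidal figure above can be reproduced with a geodesic library. The sketch below uses pyproj (an assumption; the site does not say which implementation it runs), whose Geod class implements Karney's algorithm, a modern refinement of the Vincenty approach on the WGS-84 ellipsoid. The coordinates are the airport positions listed at the bottom of this page, converted to decimal degrees.

```python
# A minimal sketch of the ellipsoidal calculation using pyproj on the
# WGS-84 ellipsoid. Coordinates are the airports' published positions
# converted to decimal degrees.
from pyproj import Geod

geod = Geod(ellps="WGS84")

# (longitude, latitude) in decimal degrees
szx = (113.8108, 22.6392)   # Shenzhen Bao'an International Airport
lhg = (147.9839, -29.4567)  # Lightning Ridge Airport

# inv() returns forward azimuth, back azimuth, and distance in meters
_, _, meters = geod.inv(szx[0], szx[1], lhg[0], lhg[1])

print(f"{meters / 1609.344:.3f} miles")       # ~4243 miles
print(f"{meters / 1000:.3f} kilometers")      # ~6829 km
print(f"{meters / 1852:.3f} nautical miles")  # ~3687 NM
```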

Haversine formula
  • 4256.313 miles
  • 6849.872 kilometers
  • 3698.635 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
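For comparison, here is a minimal self-contained haversine implementation, assuming the conventional mean Earth radius of 6371 km (the exact radius the calculator uses is not stated). The radius choice is why the spherical result differs slightly from the ellipsoidal one.

```python
# Great-circle (haversine) distance, assuming a mean Earth radius of
# 6371 km; this convention is why the result differs slightly from the
# ellipsoidal figure above.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(22.6392, 113.8108, -29.4567, 147.9839)
print(f"{km:.3f} km")            # ~6850 km
print(f"{km / 1.609344:.3f} mi") # ~4256 mi
```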

How long does it take to fly from Shenzhen to Lightning Ridge?

The estimated flight time from Shenzhen Bao'an International Airport to Lightning Ridge Airport is 8 hours and 32 minutes.
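The site does not publish its flight-time formula. One common rule of thumb, sketched below, simply divides the distance by an assumed average block speed of about 500 mph, which lands within a few minutes of the figure above.

```python
# A rough sketch of how such an estimate can be derived. The ~500 mph
# average speed is an assumption, not the site's published formula;
# it happens to land close to the 8 h 32 min figure above.
distance_miles = 4243
avg_speed_mph = 500  # assumed average block speed for a long-haul jet

hours = distance_miles / avg_speed_mph
h, m = divmod(round(hours * 60), 60)
print(f"Estimated flight time: {h} h {m} min")  # ~8 h 29 min
```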

Flight carbon footprint between Shenzhen Bao'an International Airport (SZX) and Lightning Ridge Airport (LHG)

On average, flying from Shenzhen to Lightning Ridge generates about 487 kg of CO2 per passenger, which is equivalent to 1,073 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
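The two numbers are consistent with each other, as a quick check shows. The per-kilometer factor derived below is only the figure implied by this page's own estimate, not an official emission factor.

```python
# Sanity-checking the page's numbers: the kilogram-to-pound conversion,
# and the implied per-passenger-kilometer emission factor.
co2_kg = 487
distance_km = 6829

co2_lbs = co2_kg * 2.20462
per_km_g = co2_kg * 1000 / distance_km

print(f"{co2_lbs:.0f} lbs")           # ~1074 lbs (the page rounds to 1,073)
print(f"{per_km_g:.1f} g CO2 per km") # ~71.3 g per passenger-km
```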

Map of flight path from Shenzhen to Lightning Ridge

See the map of the shortest flight path between Shenzhen Bao'an International Airport (SZX) and Lightning Ridge Airport (LHG).

Airport information

Origin: Shenzhen Bao'an International Airport
City: Shenzhen
Country: China
IATA Code: SZX
ICAO Code: ZGSZ
Coordinates: 22°38′21″N, 113°48′39″E

Destination: Lightning Ridge Airport
City: Lightning Ridge
Country: Australia
IATA Code: LHG
ICAO Code: YLRD
Coordinates: 29°27′24″S, 147°59′2″E
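The coordinates above are given in degrees, minutes, and seconds, while the distance formulas expect decimal degrees. A small conversion sketch follows; the parser assumes the exact DMS layout used on this page.

```python
# Converting the DMS coordinates listed above into decimal degrees.
# The regex assumes the exact "D°M′S″H" layout used on this page.
import re

def dms_to_decimal(dms: str) -> float:
    """Convert e.g. '22°38′21″N' to signed decimal degrees."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value

print(f"{dms_to_decimal('22°38′21″N'):.4f}")   # 22.6392  (SZX latitude)
print(f"{dms_to_decimal('113°48′39″E'):.4f}")  # 113.8108 (SZX longitude)
print(f"{dms_to_decimal('29°27′24″S'):.4f}")   # -29.4567 (LHG latitude)
print(f"{dms_to_decimal('147°59′2″E'):.4f}")   # 147.9839 (LHG longitude)
```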