
How far is Lightning Ridge from Guangzhou?

The distance between Guangzhou (Guangzhou Baiyun International Airport) and Lightning Ridge (Lightning Ridge Airport) is 4304 miles / 6927 kilometers / 3740 nautical miles.

Guangzhou Baiyun International Airport – Lightning Ridge Airport

  • 4304 miles
  • 6927 kilometers
  • 3740 nautical miles


Distance from Guangzhou to Lightning Ridge

There are several ways to calculate the distance from Guangzhou to Lightning Ridge. Here are two standard methods:

Vincenty's formula (applied above)
  • 4304.266 miles
  • 6927.045 kilometers
  • 3740.305 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
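As an illustration, the ellipsoidal calculation can be sketched with a textbook implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. This is an assumption about the method, not the calculator's own code, and the coordinates are taken from the airport table below:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, iterations=200, tol=1e-12):
    """Ellipsoidal distance in km on WGS-84 (may not converge for
    nearly antipodal points)."""
    a = 6378137.0                  # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(iterations):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# CAN: 23°23′32″N, 113°17′56″E ; LHG: 29°27′24″S, 147°59′2″E
km = vincenty_km(23 + 23/60 + 32/3600, 113 + 17/60 + 56/3600,
                 -(29 + 27/60 + 24/3600), 147 + 59/60 + 2/3600)
print(round(km, 3))   # close to the 6927.045 km quoted above
```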

Haversine formula
  • 4317.663 miles
  • 6948.605 kilometers
  • 3751.946 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between the two points along the sphere's surface.
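The spherical calculation is straightforward to implement. A minimal sketch, using the airport coordinates from the table below and an assumed mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two lat/lon points,
    assuming a spherical Earth."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# CAN: 23°23′32″N, 113°17′56″E ; LHG: 29°27′24″S, 147°59′2″E
km = haversine_km(23 + 23/60 + 32/3600, 113 + 17/60 + 56/3600,
                  -(29 + 27/60 + 24/3600), 147 + 59/60 + 2/3600)
print(round(km, 1))   # close to the 6948.605 km quoted above
```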

How long does it take to fly from Guangzhou to Lightning Ridge?

The estimated flight time from Guangzhou Baiyun International Airport to Lightning Ridge Airport is 8 hours and 38 minutes.
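The site does not state how it derives this figure, but it is consistent with dividing the Vincenty distance by an average speed of roughly 500 mph. A sketch under that assumption (the speed is not from the source):

```python
distance_miles = 4304.266    # Vincenty distance quoted above
avg_speed_mph = 500          # assumed average speed, not from the source

hours = distance_miles / avg_speed_mph
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} h {m} min")      # prints 8 h 37 min, close to the 8 h 38 min above
```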

Flight carbon footprint between Guangzhou Baiyun International Airport (CAN) and Lightning Ridge Airport (LHG)

On average, flying from Guangzhou to Lightning Ridge generates about 495 kg (1,090 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
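The pound figure follows from the standard conversion factor (1 kg ≈ 2.20462 lb):

```python
kg = 495                 # per-passenger CO2 estimate quoted above
lb = kg * 2.20462        # kilograms-to-pounds conversion factor
print(round(lb))         # prints 1091, which the page rounds to 1,090 lb
```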

Map of flight path from Guangzhou to Lightning Ridge

See the map of the shortest flight path between Guangzhou Baiyun International Airport (CAN) and Lightning Ridge Airport (LHG).

Airport information

Origin: Guangzhou Baiyun International Airport
City: Guangzhou
Country: China
IATA Code: CAN
ICAO Code: ZGGG
Coordinates: 23°23′32″N, 113°17′56″E
Destination: Lightning Ridge Airport
City: Lightning Ridge
Country: Australia
IATA Code: LHG
ICAO Code: YLRD
Coordinates: 29°27′24″S, 147°59′2″E