
How far is Liepāja from Guangzhou?

The distance between Guangzhou (Guangzhou Baiyun International Airport) and Liepāja (Liepāja International Airport) is 4972 miles / 8001 kilometers / 4320 nautical miles.

Guangzhou Baiyun International Airport – Liepāja International Airport

  • 4972 miles
  • 8001 kilometers
  • 4320 nautical miles


Distance from Guangzhou to Liepāja

There are several ways to calculate the distance from Guangzhou to Liepāja. Here are two standard methods:

Vincenty's formula (applied above)
  • 4971.514 miles
  • 8000.877 kilometers
  • 4320.128 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
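A minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the decimal-degree equivalents of the coordinates in the airport table below. This is a textbook transcription of the iterative algorithm, not the site's own code, and it omits edge-case handling (antipodal and equatorial point pairs):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate the auxiliary longitude until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    # Ellipsoidal correction terms, then the geodesic length
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # metres -> km

# CAN and LPX coordinates converted from the airport table below
print(round(vincenty_km(23.392222, 113.298889, 56.5175, 21.096667), 1))
```

The result should agree with the ~8000.9 km quoted above to within a few kilometres.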

Haversine formula
  • 4963.613 miles
  • 7988.161 kilometers
  • 4313.262 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
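The haversine calculation is short enough to show in full; a sketch assuming the conventional mean Earth radius of 6371 km and the coordinates from the airport table below:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    R = 6371.0  # mean Earth radius in km (spherical model)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))  # great-circle distance

# CAN and LPX coordinates in decimal degrees
print(round(haversine_km(23.392222, 113.298889, 56.5175, 21.096667), 1))
```

This reproduces the ~7988 km figure above; the small gap to the Vincenty result reflects the spherical versus ellipsoidal Earth models.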

How long does it take to fly from Guangzhou to Liepāja?

The estimated flight time from Guangzhou Baiyun International Airport to Liepāja International Airport is 9 hours and 54 minutes.
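The quote is consistent with a simple distance-over-average-speed model. The speed below is back-calculated from the quoted time and distance, not a figure published on this page:

```python
# Figures quoted on this page
distance_miles = 4972
flight_time_hours = 9 + 54 / 60   # 9 hours 54 minutes

# Implied average speed over the whole flight (an inference, not a source figure)
implied_speed_mph = distance_miles / flight_time_hours
print(round(implied_speed_mph))   # ~502 mph, a plausible jet block speed
```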

Flight carbon footprint between Guangzhou Baiyun International Airport (CAN) and Liepāja International Airport (LPX)

On average, flying from Guangzhou to Liepāja generates about 580 kg of CO2 per passenger; 580 kilograms equals roughly 1,279 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
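The unit conversion and the implied per-mile emission rate follow directly from the quoted figures (the per-mile rate is a derived value, not one published here):

```python
# Figures quoted on this page
distance_miles = 4972
co2_kg = 580                    # estimated CO2 per passenger

co2_lbs = co2_kg * 2.20462      # kilograms -> pounds conversion factor
per_mile_kg = co2_kg / distance_miles  # derived per-mile rate

print(round(co2_lbs))           # 1279
print(round(per_mile_kg, 3))    # 0.117 kg CO2 per passenger-mile
```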

Map of flight path from Guangzhou to Liepāja

See the map of the shortest flight path between Guangzhou Baiyun International Airport (CAN) and Liepāja International Airport (LPX).

Airport information

Origin: Guangzhou Baiyun International Airport
City: Guangzhou
Country: China
IATA Code: CAN
ICAO Code: ZGGG
Coordinates: 23°23′32″N, 113°17′56″E
Destination: Liepāja International Airport
City: Liepāja
Country: Latvia
IATA Code: LPX
ICAO Code: EVLA
Coordinates: 56°31′3″N, 21°5′48″E