
How far is Palanga from Guangzhou?

The distance between Guangzhou (Guangzhou Baiyun International Airport) and Palanga (Palanga International Airport) is 4982 miles / 8017 kilometers / 4329 nautical miles.

Guangzhou Baiyun International Airport – Palanga International Airport

  • 4982 miles
  • 8017 kilometers
  • 4329 nautical miles


Distance from Guangzhou to Palanga

There are several ways to calculate the distance from Guangzhou to Palanga. Here are two standard methods:

Vincenty's formula (applied above)
  • 4981.583 miles
  • 8017.080 kilometers
  • 4328.877 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
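As a rough illustration of the ellipsoidal approach, the sketch below uses Karney's geographiclib, a modern geodesic solver that fills the same role as Vincenty's iteration, though it is not necessarily the exact routine used by this page. The decimal coordinates are converted from the airport coordinates listed further down; treat the snippet as an assumption-laden check rather than the page's own implementation:

  # Ellipsoidal (WGS-84) distance between CAN and PLQ, in the spirit of Vincenty's formula.
  from geographiclib.geodesic import Geodesic

  can = (23.392222, 113.298889)   # 23°23′32″N, 113°17′56″E
  plq = (55.973056, 21.093889)    # 55°58′23″N, 21°5′38″E

  meters = Geodesic.WGS84.Inverse(can[0], can[1], plq[0], plq[1])["s12"]
  print(f"{meters / 1609.344:.3f} miles")          # close to the 4981.583 miles above
  print(f"{meters / 1000:.3f} kilometers")
  print(f"{meters / 1852:.3f} nautical miles")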

Haversine formula
  • 4973.612 miles
  • 8004.253 kilometers
  • 4321.951 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
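The haversine calculation is simple enough to reproduce directly. Below is a minimal Python sketch, assuming a mean Earth radius of 3,958.8 miles and reusing the airport coordinates listed further down; small differences from the figures above come from the choice of radius:

  import math

  def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
      # Great-circle distance on a sphere of the given radius.
      phi1, phi2 = math.radians(lat1), math.radians(lat2)
      dphi = math.radians(lat2 - lat1)
      dlam = math.radians(lon2 - lon1)
      a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
      return 2 * radius_miles * math.asin(math.sqrt(a))

  print(haversine_miles(23.392222, 113.298889, 55.973056, 21.093889))
  # ≈ 4973 miles, in line with the haversine figure above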

How long does it take to fly from Guangzhou to Palanga?

The estimated flight time from Guangzhou Baiyun International Airport to Palanga International Airport is 9 hours and 55 minutes.
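The page does not say how this estimate is derived. A quick consistency check (our own arithmetic, not the site's stated method) shows that the figure corresponds to an average block speed of roughly 500 mph over the stated distance:

  # Implied average speed for the stated time and distance (illustrative only).
  distance_miles = 4982
  flight_time_hours = 9 + 55 / 60           # 9 hours 55 minutes

  print(distance_miles / flight_time_hours)   # ≈ 502 mph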

Flight carbon footprint between Guangzhou Baiyun International Airport (CAN) and Palanga International Airport (PLQ)

On average, flying from Guangzhou to Palanga generates about 581 kg of CO2 per passenger, and 581 kilograms equals 1,282 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
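The emissions methodology is likewise not stated; the snippet below simply checks the unit conversion and the per-kilometre factor implied by the page's own numbers (anything beyond those figures would be an assumption):

  # Unit conversion and implied emission factor from the figures above.
  co2_kg = 581
  distance_km = 8017

  co2_lbs = co2_kg * 2.20462         # ≈ 1,281 lbs; the page's 1,282 suggests an unrounded kilogram figure
  kg_per_km = co2_kg / distance_km   # ≈ 0.072 kg of CO2 per passenger-kilometre
  print(round(co2_lbs), round(kg_per_km, 3))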

Map of flight path from Guangzhou to Palanga

See the map of the shortest flight path between Guangzhou Baiyun International Airport (CAN) and Palanga International Airport (PLQ).

Airport information

Origin: Guangzhou Baiyun International Airport
City: Guangzhou
Country: China
IATA Code: CAN
ICAO Code: ZGGG
Coordinates: 23°23′32″N, 113°17′56″E
Destination: Palanga International Airport
City: Palanga
Country: Lithuania
IATA Code: PLQ
ICAO Code: EYPA
Coordinates: 55°58′23″N, 21°5′38″E
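The degrees/minutes/seconds coordinates above convert to the decimal degrees used in the distance sketches earlier. A minimal conversion helper (our own, with these two airports' values hard-coded as an example):

  # Convert degrees/minutes/seconds to decimal degrees (S and W hemispheres are negative).
  def dms_to_decimal(degrees, minutes, seconds, hemisphere):
      sign = -1 if hemisphere in ("S", "W") else 1
      return sign * (degrees + minutes / 60 + seconds / 3600)

  can = (dms_to_decimal(23, 23, 32, "N"), dms_to_decimal(113, 17, 56, "E"))
  plq = (dms_to_decimal(55, 58, 23, "N"), dms_to_decimal(21, 5, 38, "E"))
  print(can)   # ≈ (23.392222, 113.298889)
  print(plq)   # ≈ (55.973056, 21.093889)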