
How far is Palanga from Bayda?

The distance between Bayda (Al Abraq International Airport) and Palanga (Palanga International Airport) is 1601 miles / 2577 kilometers / 1392 nautical miles.

Al Abraq International Airport – Palanga International Airport

  • 1601 miles
  • 2577 kilometers
  • 1392 nautical miles


Distance from Bayda to Palanga

There are several ways to calculate the distance from Bayda to Palanga. Here are two standard methods:

Vincenty's formula (applied above)
  • 1601.366 miles
  • 2577.149 kilometers
  • 1391.549 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
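The site does not publish its implementation; the sketch below is the standard inverse Vincenty iteration on the WGS-84 ellipsoid (semi-major axis 6378137 m, flattening 1/298.257223563), which reproduces the figure quoted above:

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):                   # iterate until lambda converges
        sinlam, coslam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sinlam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * coslam) ** 2)
        if sin_sigma == 0:
            return 0.0                     # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000

# LAQ (32°47′19″N, 21°57′51″E) to PLQ (55°58′23″N, 21°5′38″E)
km = vincenty_km(32 + 47/60 + 19/3600, 21 + 57/60 + 51/3600,
                 55 + 58/60 + 23/3600, 21 + 5/60 + 38/3600)
print(f"{km:.3f} km")  # ≈ 2577.1 km, matching the figure above
```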

Haversine formula
  • 1602.442 miles
  • 2578.881 kilometers
  • 1392.484 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
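The haversine calculation is short enough to sketch directly. Using the airport coordinates listed below and an assumed mean Earth radius of 6371 km, it reproduces the figure quoted above:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean radius."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# LAQ: 32°47′19″N, 21°57′51″E  /  PLQ: 55°58′23″N, 21°5′38″E, as decimal degrees
laq = (32 + 47/60 + 19/3600, 21 + 57/60 + 51/3600)
plq = (55 + 58/60 + 23/3600, 21 + 5/60 + 38/3600)
km = haversine_km(*laq, *plq)
print(f"{km:.3f} km")  # ≈ 2578.9 km, matching the figure above
```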

How long does it take to fly from Bayda to Palanga?

The estimated flight time from Al Abraq International Airport to Palanga International Airport is 3 hours and 31 minutes.
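The site's exact timing model is not published. A common rough estimate is cruise time at an assumed average ground speed plus a fixed allowance for taxi, climb, and descent; both parameters below are illustrative assumptions, so the result will not exactly match the site's figure:

```python
def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight time: cruise at an assumed speed plus fixed overhead.

    Both cruise_mph and overhead_min are assumptions for illustration,
    not the calculator's published model.
    """
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    return divmod(round(total_min), 60)  # (hours, minutes)

h, m = flight_time(1601.366)
print(f"{h} h {m} min")
```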

What is the time difference between Bayda and Palanga?

There is no standard time difference between Bayda and Palanga: both use UTC+2, although Lithuania observes daylight saving time (UTC+3 in summer) while Libya does not.

Flight carbon footprint between Al Abraq International Airport (LAQ) and Palanga International Airport (PLQ)

On average, flying from Bayda to Palanga generates about 186 kg (410 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
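The kilogram-to-pound conversion can be checked with the exact definition of the pound (0.45359237 kg):

```python
KG_PER_LB = 0.45359237        # exact definition of the avoirdupois pound
co2_kg = 186                  # the site's per-passenger estimate
co2_lb = co2_kg / KG_PER_LB
print(f"{co2_lb:.0f} lbs")    # → 410 lbs
```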

Map of flight path from Bayda to Palanga

See the map of the shortest flight path between Al Abraq International Airport (LAQ) and Palanga International Airport (PLQ).

Airport information

Origin Al Abraq International Airport
City: Bayda
Country: Libya
IATA Code: LAQ
ICAO Code: HLLQ
Coordinates: 32°47′19″N, 21°57′51″E
Destination Palanga International Airport
City: Palanga
Country: Lithuania
IATA Code: PLQ
ICAO Code: EYPA
Coordinates: 55°58′23″N, 21°5′38″E