
How far is Ioannina from Bayda?

The distance between Bayda (Al Abraq International Airport) and Ioannina (Ioannina National Airport) is 481 miles / 773 kilometers / 418 nautical miles.

Al Abraq International Airport – Ioannina National Airport

  • 481 miles
  • 773 kilometers
  • 418 nautical miles


Distance from Bayda to Ioannina

There are several ways to calculate the distance from Bayda to Ioannina. Here are two standard methods:

Vincenty's formula (applied above)
  • 480.521 miles
  • 773.324 kilometers
  • 417.562 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
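As a rough illustration (not the calculator's own code), the same kind of ellipsoidal distance can be computed with the third-party geopy library, whose geodesic function works on the WGS-84 ellipsoid (via Karney's method, closely related to Vincenty's):

```python
# A minimal sketch, assuming geopy is installed (pip install geopy).
from geopy.distance import geodesic

# Airport coordinates in decimal degrees, converted from the DMS values
# listed under "Airport information" below.
laq = (32.78861, 21.96417)   # Al Abraq International Airport (LAQ)
ioa = (39.69639, 20.82250)   # Ioannina National Airport (IOA)

d = geodesic(laq, ioa)       # WGS-84 ellipsoidal distance
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
# Should land close to the 480.5 mi / 773.3 km / 417.6 NM figures quoted above.
```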

Haversine formula
  • 481.486 miles
  • 774.876 kilometers
  • 418.400 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
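For comparison, here is a self-contained haversine sketch in Python, assuming a mean Earth radius of 6,371 km (the page's figures may use a slightly different radius, so the last decimals can differ):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# DMS coordinates from the "Airport information" section, converted to decimal degrees.
laq_lat, laq_lon = 32 + 47/60 + 19/3600, 21 + 57/60 + 51/3600   # 32°47′19″N, 21°57′51″E
ioa_lat, ioa_lon = 39 + 41/60 + 47/3600, 20 + 49/60 + 21/3600   # 39°41′47″N, 20°49′21″E

km = haversine_km(laq_lat, laq_lon, ioa_lat, ioa_lon)
print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km / 1.852:.1f} NM")
# Roughly 775 km / 481 mi / 418 NM, in line with the haversine figures above.
```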

How long does it take to fly from Bayda to Ioannina?

The estimated flight time from Al Abraq International Airport to Ioannina National Airport is 1 hour and 24 minutes.
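The assumptions behind that estimate are not spelled out on the page. A common rule of thumb is cruise distance divided by an average speed of about 500 mph, plus roughly half an hour for taxi, climb and descent; a hypothetical sketch of that rule gives a similar figure:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Rule-of-thumb estimate: cruise time plus a fixed allowance for taxi,
    # climb and descent. This is an assumption, not the calculator's formula.
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimate_flight_minutes(481)
print(f"{int(minutes // 60)} h {int(minutes % 60)} min")
# Prints about 1 h 27 min, in the same range as the 1 h 24 min quoted above.
```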

What is the time difference between Bayda and Ioannina?

There is no time difference between Bayda and Ioannina.

Flight carbon footprint between Al Abraq International Airport (LAQ) and Ioannina National Airport (IOA)

On average, flying from Bayda to Ioannina generates about 96 kg of CO2 per passenger, which is roughly 211 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
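The kilogram-to-pound conversion behind that figure is simple arithmetic (1 kg ≈ 2.20462 lb):

```python
co2_kg = 96
co2_lbs = co2_kg * 2.20462                  # 1 kg is approximately 2.20462 lb
print(f"{co2_kg} kg ≈ {co2_lbs:.1f} lbs")   # 96 kg ≈ 211.6 lbs (the page rounds to 211)
```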

Map of flight path from Bayda to Ioannina

See the map of the shortest flight path between Al Abraq International Airport (LAQ) and Ioannina National Airport (IOA).

Airport information

Origin: Al Abraq International Airport
City: Bayda
Country: Libya
IATA Code: LAQ
ICAO Code: HLLQ
Coordinates: 32°47′19″N, 21°57′51″E

Destination: Ioannina National Airport
City: Ioannina
Country: Greece
IATA Code: IOA
ICAO Code: LGIO
Coordinates: 39°41′47″N, 20°49′21″E