
How far is Cagliari from Buraidah?

The distance between Buraidah (Prince Naif bin Abdulaziz International Airport) and Cagliari (Cagliari Elmas Airport) is 2192 miles / 3528 kilometers / 1905 nautical miles.

Prince Naif bin Abdulaziz International Airport – Cagliari Elmas Airport


Distance from Buraidah to Cagliari

There are several ways to calculate the distance from Buraidah to Cagliari. Here are two standard methods:

Vincenty's formula (applied above)
  • 2191.974 miles
  • 3527.641 kilometers
  • 1904.774 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
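Vincenty's inverse method can be sketched in Python on the WGS-84 ellipsoid. This is an illustrative implementation of the standard published algorithm, not the calculator's own code, and the decimal coordinates are converted from the airport information below:

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns km."""
    a, f = 6378137.0, 1 / 298.257223563        # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a                            # semi-minor axis
    U1 = atan((1 - f) * tan(radians(lat1)))    # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):                       # iterate lambda until convergence
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev, lam = lam, L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dsigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - dsigma) / 1000.0

# ELQ (26°18′10″N, 43°46′27″E) to CAG (39°15′5″N, 9°3′15″E)
print(f"{vincenty_km(26.3028, 43.7742, 39.2514, 9.0542):.1f} km")  # ~3527.6 km
```

The result agrees with the 3527.641 km quoted above to within the rounding of the input coordinates.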

Haversine formula
  • 2189.105 miles
  • 3523.024 kilometers
  • 1902.281 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
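The haversine calculation fits in a few lines of Python. A mean earth radius of 6371 km is assumed here, and the decimal coordinates are converted from the airport information below:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points, assuming a spherical earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# ELQ (26°18′10″N, 43°46′27″E) to CAG (39°15′5″N, 9°3′15″E)
print(f"{haversine_km(26.3028, 43.7742, 39.2514, 9.0542):.0f} km")  # ~3523 km
```

The spherical result comes out a few kilometers shorter than the ellipsoidal Vincenty figure, as the table above shows.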

How long does it take to fly from Buraidah to Cagliari?

The estimated flight time from Prince Naif bin Abdulaziz International Airport to Cagliari Elmas Airport is 4 hours and 39 minutes.
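Estimates like this typically combine a fixed allowance for taxi, climb, and descent with the distance flown at an average cruise speed. A minimal sketch, where the 30-minute overhead and 500 mph cruise speed are illustrative assumptions rather than the calculator's published parameters:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block time: fixed overhead plus distance at an assumed cruise speed."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes:02d} min"

print(estimated_flight_time(2192))  # prints "4 h 53 min" under these assumptions
```

Different speed and overhead assumptions shift the estimate by several minutes, which is why the figure above need not match this sketch exactly.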

Flight carbon footprint between Prince Naif bin Abdulaziz International Airport (ELQ) and Cagliari Elmas Airport (CAG)

On average, flying from Buraidah to Cagliari generates about 239 kg of CO2 per passenger, equivalent to roughly 527 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
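The kilogram-to-pound conversion behind that figure is a fixed constant; the per-passenger CO2 value itself depends on aircraft type and load-factor assumptions that the page does not publish:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert a mass in kilograms to pounds."""
    return kg / KG_PER_LB

print(f"{kg_to_lb(239):.0f} lbs")  # 527 lbs
```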

Map of flight path from Buraidah to Cagliari

See the map of the shortest flight path between Prince Naif bin Abdulaziz International Airport (ELQ) and Cagliari Elmas Airport (CAG).

Airport information

Origin: Prince Naif bin Abdulaziz International Airport
City: Buraidah
Country: Saudi Arabia
IATA Code: ELQ
ICAO Code: OEGS
Coordinates: 26°18′10″N, 43°46′27″E
Destination: Cagliari Elmas Airport
City: Cagliari
Country: Italy
IATA Code: CAG
ICAO Code: LIEE
Coordinates: 39°15′5″N, 9°3′15″E
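The coordinates above are given in degrees, minutes, and seconds; the distance formulas need them as signed decimal degrees. A small converter (the hemisphere-letter handling is an assumption about this page's notation, in which S and W are negative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# ELQ: 26°18′10″N, 43°46′27″E
print(round(dms_to_decimal(26, 18, 10, "N"), 4),
      round(dms_to_decimal(43, 46, 27, "E"), 4))  # 26.3028 43.7742
```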