
How far is Luxor from Buraidah?

The distance between Buraidah (Prince Naif bin Abdulaziz International Airport) and Luxor (Luxor International Airport) is 690 miles / 1110 kilometers / 599 nautical miles.

Prince Naif bin Abdulaziz International Airport – Luxor International Airport

690 miles / 1110 kilometers / 599 nautical miles


Distance from Buraidah to Luxor

There are several ways to calculate the distance from Buraidah to Luxor. Here are two standard methods:

Vincenty's formula (applied above)
  • 689.767 miles
  • 1110.073 kilometers
  • 599.392 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
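For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed further down (converted to decimal degrees). It is an illustrative implementation, not the calculator's own code, and the convergence tolerance is an arbitrary choice.

```python
import math

# WGS-84 ellipsoid constants
A = 6378137.0              # semi-major axis, metres
F = 1 / 298.257223563      # flattening
B = (1 - F) * A            # semi-minor axis, metres

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in metres between two points given in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(phi1))
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = big_b * sin_sigma * (
        cos_2sigma_m + big_b / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - big_b / 6 * cos_2sigma_m
            * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return B * big_a * (sigma - delta_sigma)

# ELQ and LXR coordinates converted to decimal degrees
elq = (26.302778, 43.774167)   # 26°18′10″N, 43°46′27″E
lxr = (25.670833, 32.706389)   # 25°40′15″N, 32°42′23″E

metres = vincenty_inverse(*elq, *lxr)
print(f"{metres / 1000:.3f} km")   # should land close to the ~1110 km figure above
```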

Haversine formula
  • 688.569 miles
  • 1108.144 kilometers
  • 598.350 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
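The haversine version is much shorter. The sketch below assumes the commonly used mean Earth radius of 6371 km; the calculator does not state which radius it uses, so the result may differ slightly from the figures above.

```python
import math

EARTH_RADIUS_KM = 6371.0   # assumed mean Earth radius

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two points given in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# ELQ -> LXR, coordinates in decimal degrees
km = haversine(26.302778, 43.774167, 25.670833, 32.706389)
print(f"{km:.3f} km ({km * 0.621371:.3f} mi, {km / 1.852:.3f} NM)")
```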

How long does it take to fly from Buraidah to Luxor?

The estimated flight time from Prince Naif bin Abdulaziz International Airport to Luxor International Airport is 1 hour and 48 minutes.
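The site does not publish how it derives flight times, but estimates like this are typically the great-circle distance divided by an average cruise speed plus a fixed allowance for taxi, climb and descent. The sketch below uses an assumed 500 mph cruise speed and 30-minute overhead; these are illustrative parameters only and do not exactly reproduce the 1 hour 48 minute figure.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed overhead plus time at an assumed cruise speed."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(690))  # about 1 h 53 min with these assumed parameters
```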

Flight carbon footprint between Prince Naif bin Abdulaziz International Airport (ELQ) and Luxor International Airport (LXR)

On average, flying from Buraidah to Luxor generates about 123 kg of CO2 per passenger, which is roughly 271 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
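A common way to approximate per-passenger emissions is to multiply the flight distance by an average emission factor. The factor of 0.111 kg CO2 per passenger-kilometre below is an assumption chosen to land near the quoted figure, not the calculator's published methodology.

```python
KG_CO2_PER_PASSENGER_KM = 0.111   # assumed average emission factor (jet fuel burn only)
LBS_PER_KG = 2.20462

def co2_estimate(distance_km, factor=KG_CO2_PER_PASSENGER_KM):
    """Return (kg, lbs) of CO2 per passenger for a flight of the given distance."""
    kg = round(distance_km * factor)
    return kg, round(kg * LBS_PER_KG)

kg, lbs = co2_estimate(1110)
print(f"{kg} kg CO2 per passenger (~{lbs} lbs)")  # about 123 kg (~271 lbs)
```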

Map of flight path from Buraidah to Luxor

See the map of the shortest flight path between Prince Naif bin Abdulaziz International Airport (ELQ) and Luxor International Airport (LXR).

Airport information

Origin: Prince Naif bin Abdulaziz International Airport
City: Buraidah
Country: Saudi Arabia
IATA Code: ELQ
ICAO Code: OEGS
Coordinates: 26°18′10″N, 43°46′27″E
Destination: Luxor International Airport
City: Luxor
Country: Egypt
IATA Code: LXR
ICAO Code: HELX
Coordinates: 25°40′15″N, 32°42′23″E
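The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier expect decimal degrees. A small helper for the conversion (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Prince Naif bin Abdulaziz International Airport (ELQ): 26°18′10″N, 43°46′27″E
elq = (dms_to_decimal(26, 18, 10, "N"), dms_to_decimal(43, 46, 27, "E"))
# Luxor International Airport (LXR): 25°40′15″N, 32°42′23″E
lxr = (dms_to_decimal(25, 40, 15, "N"), dms_to_decimal(32, 42, 23, "E"))

print(elq, lxr)  # approx (26.3028, 43.7742) and (25.6708, 32.7064)
```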