How far is Baghdad from Sana'a?
The distance between Sana'a (Sanaa International Airport) and Baghdad (Baghdad International Airport) is 1224 miles / 1970 kilometers / 1064 nautical miles.
The driving distance from Sana'a (SAH) to Baghdad (BGW) is 1726 miles / 2777 kilometers, and travel time by car is about 31 hours 47 minutes.
Sanaa International Airport – Baghdad International Airport
Distance from Sana'a to Baghdad
There are several ways to calculate the distance from Sana'a to Baghdad. Here are two standard methods:
Vincenty's formula (applied above)
- 1224.206 miles
- 1970.169 kilometers
- 1063.806 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
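The iterative inverse formula can be sketched directly in Python from the published Vincenty equations. This is an illustrative implementation, not the calculator used above; the WGS-84 constants are standard, and the coordinates are the decimal-degree equivalents of the DMS values in the airport table further down.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns distance in miles."""
    a = 6378137.0          # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563  # WGS-84 flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial lines
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344  # metres per statute mile

# SAH and BGW in decimal degrees (from the airport table below)
print(vincenty_miles(15.4761, 44.2194, 33.2625, 44.2344))  # ≈ 1224 miles
```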
Haversine formula
- 1228.910 miles
- 1977.739 kilometers
- 1067.893 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
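The haversine formula is short enough to write out in full. A minimal Python sketch, using the same airport coordinates (the mean earth radius of 3958.8 miles is a common choice and reproduces the spherical figure above to within rounding):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, assuming a spherical earth."""
    r = 3958.8  # mean earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# SAH and BGW in decimal degrees (from the airport table below)
print(haversine_miles(15.4761, 44.2194, 33.2625, 44.2344))  # ≈ 1228.9 miles
```

Because the two airports lie on almost the same meridian (both near 44°14′E), nearly all of the distance comes from the ~17.8° difference in latitude.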
How long does it take to fly from Sana'a to Baghdad?
The estimated flight time from Sanaa International Airport to Baghdad International Airport is 2 hours and 49 minutes.
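Flight-time estimates like this are typically derived from the great-circle distance at a nominal cruise speed plus a fixed allowance for taxi, climb, and descent. The sketch below uses assumed constants (500 mph cruise, 30 minutes overhead) for illustration; it lands close to, but not exactly at, the 2 hours 49 minutes quoted, since the site's exact model is not published here.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time at a nominal speed plus a fixed
    taxi/climb/descent overhead. Constants are illustrative assumptions."""
    minutes = distance_miles / cruise_mph * 60 + overhead_min
    h, m = divmod(round(minutes), 60)
    return f"{h} h {m} min"

print(estimate_flight_time(1224.2))  # 2 h 57 min with these assumed constants
```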
What is the time difference between Sana'a and Baghdad?
There is no time difference between Sana'a and Baghdad: both cities observe Arabia Standard Time (UTC+3) year-round.
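The offsets can be checked with Python's standard `zoneinfo` module (requires the IANA time zone database): Sana'a falls under `Asia/Aden` and Baghdad under `Asia/Baghdad`, and both sit at UTC+3.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

now = datetime.now(timezone.utc)
for tz in ("Asia/Aden", "Asia/Baghdad"):
    # utcoffset() gives the zone's current offset from UTC
    print(tz, now.astimezone(ZoneInfo(tz)).utcoffset())
```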
Flight carbon footprint between Sanaa International Airport (SAH) and Baghdad International Airport (BGW)
On average, flying from Sana'a to Baghdad generates about 162 kg of CO2 per passenger; 162 kilograms equals about 357 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
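The unit conversion, and the implied per-kilometre emission rate, are simple arithmetic on the figures above:

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

co2_kg = 162                       # per-passenger estimate from the text
co2_lb = co2_kg / KG_PER_LB        # ≈ 357 lbs
g_per_km = co2_kg * 1000 / 1970    # ≈ 82 g of CO2 per km flown, per passenger
print(round(co2_lb), round(g_per_km))
```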
Map of flight path and driving directions from Sana'a to Baghdad
See the map of the shortest flight path between Sanaa International Airport (SAH) and Baghdad International Airport (BGW).
Airport information
| Origin | Sanaa International Airport |
| --- | --- |
| City | Sana'a |
| Country | Yemen |
| IATA Code | SAH |
| ICAO Code | OYSN |
| Coordinates | 15°28′34″N, 44°13′10″E |
| Destination | Baghdad International Airport |
| --- | --- |
| City | Baghdad |
| Country | Iraq |
| IATA Code | BGW |
| ICAO Code | ORBI |
| Coordinates | 33°15′45″N, 44°14′4″E |
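The coordinates in the tables are given in degrees/minutes/seconds; the distance formulas above need decimal degrees. A small helper makes the conversion explicit (southern and western hemispheres get a negative sign):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(round(dms_to_decimal(15, 28, 34, "N"), 4))  # SAH latitude: 15.4761
print(round(dms_to_decimal(33, 15, 45, "N"), 4))  # BGW latitude: 33.2625
```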