How far is Yonago from Baghdad?

The distance between Baghdad (Baghdad International Airport) and Yonago (Miho-Yonago Airport) is 4896 miles / 7880 kilometers / 4255 nautical miles.

The driving distance from Baghdad (BGW) to Yonago (YGJ) is 6194 miles / 9968 kilometers, and travel time by car is about 122 hours 30 minutes.

Baghdad International Airport – Miho-Yonago Airport

4896 miles
7880 kilometers
4255 nautical miles

Distance from Baghdad to Yonago

There are several ways to calculate the distance from Baghdad to Yonago. Here are two standard methods:

Vincenty's formula (applied above)
  • 4896.310 miles
  • 7879.846 kilometers
  • 4254.777 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
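For readers who want to reproduce the figure, below is a minimal Python sketch of the inverse Vincenty iteration on the WGS-84 ellipsoid. The function name, constants, and convergence settings are illustrative choices, not the calculator's actual code.

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2,
                         a=6378137.0, f=1 / 298.257223563,
                         tol=1e-12, max_iter=200):
        """Inverse Vincenty problem on the WGS-84 ellipsoid; returns meters."""
        b = a * (1 - f)
        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                      # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos(2*sigma_m); zero when the path runs along the equator
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # BGW and YGJ coordinates (from the airport information below), in decimal degrees
    meters = vincenty_inverse(33.262500, 44.234444, 35.491944, 133.235833)
    print(f"{meters / 1000:.1f} km / {meters / 1609.344:.1f} mi")  # ~7879.8 km / ~4896.3 mi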

Haversine formula
  • 4885.780 miles
  • 7862.901 kilometers
  • 4245.627 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the sphere's surface).
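
The spherical version is much shorter. Here is a minimal sketch, assuming a mean earth radius of 6371 km (the site's exact radius constant is not stated):

    import math

    def haversine(lat1, lon1, lat2, lon2, r_km=6371.0):
        """Great-circle distance on a sphere of radius r_km; returns kilometers."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * r_km * math.asin(math.sqrt(h))

    km = haversine(33.262500, 44.234444, 35.491944, 133.235833)
    print(f"{km:.1f} km / {km / 1.609344:.1f} mi")  # near the 7862.9 km figure above;
                                                    # small offsets come from the radius choice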

How long does it take to fly from Baghdad to Yonago?

The estimated flight time from Baghdad International Airport to Miho-Yonago Airport is 9 hours and 46 minutes.
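
That estimate is consistent with a simple rule of thumb: divide the great-circle distance by an assumed average block speed of about 500 mph. The speed is an illustrative assumption, not the site's published model.

    distance_mi = 4896.310            # Vincenty distance from above
    avg_speed_mph = 500               # assumed average block speed (illustrative)
    minutes = round(distance_mi / avg_speed_mph * 60)
    print(f"{minutes // 60} h {minutes % 60} min")  # ~9 h 48 min, near the quoted 9 h 46 min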

Flight carbon footprint between Baghdad International Airport (BGW) and Miho-Yonago Airport (YGJ)

On average, flying from Baghdad to Yonago generates about 570 kg of CO2 per passenger, which is equivalent to 1,257 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
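
As a rough cross-check, the figure is consistent with multiplying distance by an emission factor of about 0.116 kg of CO2 per passenger-mile. That factor is inferred from the numbers above for illustration only; the calculator's actual methodology is not published here.

    distance_mi = 4896.310
    kg_per_passenger_mile = 0.116     # assumed emission factor (illustrative)
    co2_kg = distance_mi * kg_per_passenger_mile
    print(f"{co2_kg:.0f} kg CO2 (~{co2_kg * 2.20462:.0f} lb)")  # ~568 kg, in line with ~570 kg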

Map of flight path and driving directions from Baghdad to Yonago

See the map of the shortest flight path between Baghdad International Airport (BGW) and Miho-Yonago Airport (YGJ).

Airport information

Origin Baghdad International Airport
City: Baghdad
Country: Iraq
IATA Code: BGW
ICAO Code: ORBI
Coordinates: 33°15′45″N, 44°14′4″E
Destination Miho-Yonago Airport
City: Yonago
Country: Japan
IATA Code: YGJ
ICAO Code: RJOH
Coordinates: 35°29′31″N, 133°14′9″E
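
To plug the listed coordinates into the formulas above, convert them from degrees/minutes/seconds to decimal degrees. A small helper (the name is hypothetical) makes the conversion explicit:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Signed decimal degrees from DMS plus a hemisphere letter (N/S/E/W)."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    # BGW: 33°15′45″N, 44°14′4″E  ->  (33.262500, 44.234444)
    print(dms_to_decimal(33, 15, 45, "N"), dms_to_decimal(44, 14, 4, "E"))
    # YGJ: 35°29′31″N, 133°14′9″E ->  (35.491944, 133.235833)
    print(dms_to_decimal(35, 29, 31, "N"), dms_to_decimal(133, 14, 9, "E"))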