How far is Amakusa from Baghdad?

The distance between Baghdad (Baghdad International Airport) and Amakusa (Amakusa Airfield) is 4835 miles / 7782 kilometers / 4202 nautical miles.

The driving distance from Baghdad (BGW) to Amakusa (AXJ) is 6057 miles / 9747 kilometers, and travel time by car is about 119 hours 35 minutes.

Distance from Baghdad to Amakusa

There are several ways to calculate the distance from Baghdad to Amakusa. Here are two standard methods:

Vincenty's formula (applied above)
  • 4835.348 miles
  • 7781.738 kilometers
  • 4201.802 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the Earth's surface using an ellipsoidal model of the planet, which is why its result differs slightly from the spherical haversine figure below.
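
As a concrete illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid (a common default; the calculator does not state which ellipsoid it uses):

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in meters between two points: Vincenty inverse on WGS-84."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0            # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial line
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                                 * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

m = vincenty_distance(33.2625, 44.234444, 32.482222, 130.158889)  # BGW -> AXJ
print(f"{m / 1609.344:.0f} mi / {m / 1000:.0f} km / {m / 1852:.0f} nmi")
# -> approximately 4835 mi / 7782 km / 4202 nmi
```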

Haversine formula
  • 4825.304 miles
  • 7765.573 kilometers
  • 4193.074 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth; the result is the great-circle distance, the shortest path between two points along the surface of a sphere.
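
A corresponding sketch of the haversine formula, using the IUGG mean Earth radius (the page does not say which radius it assumes, so the last digits may differ slightly):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, r_km=6371.0088):
    """Great-circle distance in kilometers on a sphere of radius r_km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(h))

km = haversine_distance(33.2625, 44.234444, 32.482222, 130.158889)
print(f"{km:.0f} km")  # -> approximately 7766 km, vs 7782 km on the ellipsoid
```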

How long does it take to fly from Baghdad to Amakusa?

The estimated flight time from Baghdad International Airport to Amakusa Airfield is 9 hours and 39 minutes.
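
The calculator does not publish its flight-time model; a constant average speed of about 500 mph over the great-circle distance reproduces the quoted figure to within a minute, so a hedged sketch might look like this:

```python
def estimated_flight_time(distance_miles, avg_mph=500.0):
    """Naive flight-time estimate assuming a constant average speed.

    The 500 mph figure is an assumption, not the site's documented model.
    """
    hours = distance_miles / avg_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    return f"{h} hours {m} minutes"

print(estimated_flight_time(4835.348))  # -> "9 hours 40 minutes"
```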

Flight carbon footprint between Baghdad International Airport (BGW) and Amakusa Airfield (AXJ)

On average, flying from Baghdad to Amakusa generates about 562 kg of CO2 per passenger, roughly 1,240 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
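
The page does not document its emissions methodology. Back-solving from the numbers above gives roughly 0.116 kg of CO2 per passenger-mile; treat the factor below as an assumption, not the calculator's published value:

```python
KG_PER_LB = 0.45359237              # exact definition of the pound
CO2_KG_PER_PASSENGER_MILE = 0.1162  # assumed factor, back-solved from 562 kg / 4835 mi

def co2_per_passenger_kg(distance_miles):
    """CO2 per passenger in kg, assuming a flat per-mile emission factor."""
    return distance_miles * CO2_KG_PER_PASSENGER_MILE

kg = co2_per_passenger_kg(4835.348)
print(f"{kg:.0f} kg = {kg / KG_PER_LB:.0f} lbs")  # -> about 562 kg = 1239 lbs
```

The quoted 1,240 lbs presumably reflects rounding from an unrounded kilogram value.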

Map of flight path and driving directions from Baghdad to Amakusa

[Map: shortest flight path between Baghdad International Airport (BGW) and Amakusa Airfield (AXJ)]

Airport information

Origin Baghdad International Airport
City: Baghdad
Country: Iraq
IATA Code: BGW
ICAO Code: ORBI
Coordinates: 33°15′45″N, 44°14′4″E
Destination Amakusa Airfield
City: Amakusa
Country: Japan
IATA Code: AXJ
ICAO Code: RJDA
Coordinates: 32°28′56″N, 130°9′32″E
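
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier take signed decimal degrees. A small conversion helper (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1  # south/west are negative
    return sign * (degrees + minutes / 60 + seconds / 3600)

bgw = (dms_to_decimal(33, 15, 45, "N"), dms_to_decimal(44, 14, 4, "E"))
axj = (dms_to_decimal(32, 28, 56, "N"), dms_to_decimal(130, 9, 32, "E"))
print(bgw)  # (33.2625, 44.234444...)
print(axj)  # (32.482222..., 130.158889...)
```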