
How far is Nanchang from Baghdad?

The distance between Baghdad (Baghdad International Airport) and Nanchang (Nanchang Changbei International Airport) is 4176 miles / 6721 kilometers / 3629 nautical miles.

The driving distance from Baghdad (BGW) to Nanchang (KHN) is 5230 miles / 8417 kilometers, and travel time by car is about 100 hours 38 minutes.

Baghdad International Airport – Nanchang Changbei International Airport

4176 miles · 6721 kilometers · 3629 nautical miles


Distance from Baghdad to Nanchang

There are several ways to calculate the distance from Baghdad to Nanchang. Here are two standard methods:

Vincenty's formula (applied above)
  • 4176.079 miles
  • 6720.748 kilometers
  • 3628.914 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
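As a rough illustration (not the calculator's own code), the iterative Vincenty inverse method on the WGS-84 ellipsoid can be sketched in Python. The coordinates used in the example are BGW and KHN from the airport table below:

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# BGW (33°15'45"N, 44°14'4"E) to KHN (28°51'53"N, 115°54'0"E)
print(round(vincenty_m(33.2625, 44.2344, 28.8647, 115.9) / 1000, 1))
```

Run on the coordinates above, this yields a distance close to the 6720.748 km quoted on this page; small differences can arise from coordinate rounding.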

Haversine formula
  • 4167.850 miles
  • 6707.505 kilometers
  • 3621.763 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
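The spherical calculation above can be sketched in a few lines of Python. The mean Earth radius of 6371 km is a common convention, though the page does not state which radius it uses:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# BGW -> KHN, coordinates from the airport table below
print(round(haversine_km(33.2625, 44.2344, 28.8647, 115.9)))  # ~6707 km
```

The result agrees with the 6707.505 km haversine figure quoted above to within rounding of the input coordinates.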

How long does it take to fly from Baghdad to Nanchang?

The estimated flight time from Baghdad International Airport to Nanchang Changbei International Airport is 8 hours and 24 minutes.
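Flight-time estimates of this kind typically divide the great-circle distance by an assumed cruise speed and add a fixed allowance for taxi, climb, and descent. The speed and allowance below are illustrative assumptions, not the calculator's published parameters, so the result will not exactly match the 8 hours 24 minutes quoted above:

```python
def flight_time_hours(distance_miles, cruise_mph=500.0, buffer_hours=0.5):
    # cruise_mph and buffer_hours are illustrative assumptions,
    # not values published by this calculator
    return distance_miles / cruise_mph + buffer_hours

hours = flight_time_hours(4176)
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"about {h} h {m} min")
```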

Flight carbon footprint between Baghdad International Airport (BGW) and Nanchang Changbei International Airport (KHN)

On average, flying from Baghdad to Nanchang generates about 478 kg of CO2 per passenger, which is roughly 1,055 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
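The kilogram-to-pound conversion uses the exact definition of the pound (0.45359237 kg). Converting the rounded 478 kg gives about 1,054 lb; the 1,055 lb quoted above presumably comes from an unrounded kilogram figure:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(478)))  # ~1054 lb from the rounded kg figure
```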

Map of flight path and driving directions from Baghdad to Nanchang

See the map of the shortest flight path between Baghdad International Airport (BGW) and Nanchang Changbei International Airport (KHN).

Airport information

Origin Baghdad International Airport
City: Baghdad
Country: Iraq
IATA Code: BGW
ICAO Code: ORBI
Coordinates: 33°15′45″N, 44°14′4″E
Destination Nanchang Changbei International Airport
City: Nanchang
Country: China
IATA Code: KHN
ICAO Code: ZSCN
Coordinates: 28°51′53″N, 115°54′0″E