
How far is Agartala from Bangda?

The distance between Bangda (Qamdo Bamda Airport) and Agartala (Agartala Airport) is 584 miles / 940 kilometers / 507 nautical miles.

The driving distance from Bangda (BPX) to Agartala (IXA) is 1255 miles / 2019 kilometers, and travel time by car is about 25 hours 24 minutes.

Qamdo Bamda Airport – Agartala Airport

Distance: 584 miles / 940 kilometers / 507 nautical miles
Flight time: 1 h 36 min
CO2 emission: 111 kg


Distance from Bangda to Agartala

There are several ways to calculate the distance from Bangda to Agartala. Here are two standard methods:

Vincenty's formula (applied above)
  • 583.835 miles
  • 939.592 kilometers
  • 507.339 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
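For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, iteration limit, and convergence tolerance are illustrative choices, not part of this site's calculator.

import math

def vincenty_distance(lat1, lon1, lat2, lon2,
                      a=6378137.0, f=1 / 298.257223563):
    """Distance in metres between two points on the WGS-84 ellipsoid."""
    b = (1 - f) * a                      # semi-minor axis
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                 # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                   # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                  if cos_sq_alpha else 0.0)
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma *
                                     (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2)
        - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma)  # distance in metres

With the BPX and IXA coordinates listed under Airport information, this sketch should return roughly 939.6 km (about 584 miles), in line with the Vincenty figure above.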

Haversine formula
  • 584.693 miles
  • 940.973 kilometers
  • 508.085 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
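A corresponding sketch of the haversine formula, assuming the commonly used mean Earth radius of 6,371 km (the function name and default radius are illustrative):

import math

def haversine_distance(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometres on a sphere of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # a is the squared half-chord length between the two points
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

With this radius and the BPX/IXA coordinates, the result is roughly 941 km (about 585 miles), matching the haversine figure above.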

How long does it take to fly from Bangda to Agartala?

The estimated flight time from Qamdo Bamda Airport to Agartala Airport is 1 hour and 36 minutes.

Flight carbon footprint between Qamdo Bamda Airport (BPX) and Agartala Airport (IXA)

On average, flying from Bangda to Agartala generates about 111 kg of CO2 per passenger, which is equivalent to about 244 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Bangda to Agartala

See the map of the shortest flight path between Qamdo Bamda Airport (BPX) and Agartala Airport (IXA).

Airport information

Origin: Qamdo Bamda Airport
City: Bangda
Country: China
IATA Code: BPX
ICAO Code: ZUBD
Coordinates: 30°33′12″N, 97°6′29″E
Destination: Agartala Airport
City: Agartala
Country: India
IATA Code: IXA
ICAO Code: VEAT
Coordinates: 23°53′13″N, 91°14′25″E
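The coordinates above are given in degrees, minutes, and seconds. The short sketch below (a hypothetical helper, not part of this site) converts them to decimal degrees and feeds them to the two functions defined earlier on this page:

def dms_to_decimal(degrees, minutes, seconds):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return degrees + minutes / 60 + seconds / 3600

# Qamdo Bamda Airport (BPX): 30°33′12″N, 97°6′29″E
bpx_lat = dms_to_decimal(30, 33, 12)
bpx_lon = dms_to_decimal(97, 6, 29)

# Agartala Airport (IXA): 23°53′13″N, 91°14′25″E
ixa_lat = dms_to_decimal(23, 53, 13)
ixa_lon = dms_to_decimal(91, 14, 25)

print(vincenty_distance(bpx_lat, bpx_lon, ixa_lat, ixa_lon) / 1000)  # ellipsoidal distance, ~939.6 km
print(haversine_distance(bpx_lat, bpx_lon, ixa_lat, ixa_lon))        # great-circle distance, ~941.0 km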