How far is Bangda from Khajuraho?

The distance between Khajuraho (Khajuraho Airport) and Bangda (Qamdo Bamda Airport) is 1124 miles / 1808 kilometers / 976 nautical miles.

The driving distance from Khajuraho (HJR) to Bangda (BPX) is 1629 miles / 2621 kilometers, and travel time by car is about 31 hours 52 minutes.

Khajuraho Airport – Qamdo Bamda Airport

Distance: 1124 miles / 1808 kilometers / 976 nautical miles
Flight time: 2 h 37 min
CO2 emission: 158 kg

Distance from Khajuraho to Bangda

There are several ways to calculate the distance from Khajuraho to Bangda. Here are two standard methods:

Vincenty's formula (applied above)
  • 1123.702 miles
  • 1808.423 kilometers
  • 976.470 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
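For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, iteration cap, and convergence tolerance are our own choices for illustration, not anything published by this page:

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: geodesic distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate the longitude on the auxiliary sphere to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0    # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# HJR (24°49′1″N, 79°55′6″E) to BPX (30°33′12″N, 97°6′29″E), in decimal degrees:
d_m = vincenty_distance_m(24.8169, 79.9183, 30.5533, 97.1081)
print(d_m / 1000)      # ≈ 1808 km, matching the figure above
print(d_m / 1609.344)  # ≈ 1124 miles
```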

Haversine formula
  • 1122.368 miles
  • 1806.276 kilometers
  • 975.311 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
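The spherical computation is much shorter. This sketch uses the conventional mean Earth radius of 6371 km; the page does not state which radius it uses, so the last decimal places may differ:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

print(haversine_km(24.8169, 79.9183, 30.5533, 97.1081))  # ≈ 1806 km, as above
```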

How long does it take to fly from Khajuraho to Bangda?

The estimated flight time from Khajuraho Airport to Qamdo Bamda Airport is 2 hours and 37 minutes.
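The page does not state how it derives this figure, but estimates like it are typically cruise time at a nominal airliner speed plus a fixed allowance for taxi, climb, and descent. A hypothetical sketch; the 30-minute allowance and 500 mph cruise speed are illustrative assumptions, so it will not reproduce 2 h 37 min exactly:

```python
def flight_time_min(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block time: fixed taxi/climb/descent allowance plus cruise time.

    Both parameters are assumed values, not this site's published model."""
    return overhead_min + distance_miles / cruise_mph * 60

t = flight_time_min(1124)
print(f"{int(t // 60)} h {round(t % 60)} min")  # ≈ 2 h 45 min under these assumptions
```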

Flight carbon footprint between Khajuraho Airport (HJR) and Qamdo Bamda Airport (BPX)

On average, flying from Khajuraho to Bangda generates about 158 kg of CO2 per passenger, or roughly 348 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
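The page does not document its emissions model, but the numbers above imply roughly 158 kg / 1808 km ≈ 87 g of CO2 per passenger-kilometre on this route. A sketch using that implied factor; the function and constants are our own, for illustration only:

```python
KG_CO2_PER_PAX_KM = 158 / 1808  # ≈ 0.0874, implied by the figures above
LBS_PER_KG = 2.20462

def co2_kg(distance_km, factor=KG_CO2_PER_PAX_KM):
    """Per-passenger CO2 from jet-fuel burn, using the route's implied factor."""
    return distance_km * factor

kg = co2_kg(1808)
print(round(kg), "kg =", round(kg * LBS_PER_KG), "lbs")  # 158 kg = 348 lbs
```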

Map of flight path and driving directions from Khajuraho to Bangda

See the map of the shortest flight path between Khajuraho Airport (HJR) and Qamdo Bamda Airport (BPX).

Airport information

Origin Khajuraho Airport
City: Khajuraho
Country: India
IATA Code: HJR
ICAO Code: VAKJ
Coordinates: 24°49′1″N, 79°55′6″E
Destination Qamdo Bamda Airport
City: Bangda
Country: China
IATA Code: BPX
ICAO Code: ZUBD
Coordinates: 30°33′12″N, 97°6′29″E
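The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small helper for the conversion; the function name is our own:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Khajuraho Airport: 24°49′1″N, 79°55′6″E
print(dms_to_decimal(24, 49, 1, "N"), dms_to_decimal(79, 55, 6, "E"))   # 24.8169 79.9183
# Qamdo Bamda Airport: 30°33′12″N, 97°6′29″E
print(dms_to_decimal(30, 33, 12, "N"), dms_to_decimal(97, 6, 29, "E"))  # 30.5533 97.1081
```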