
How far is Bangda from Jharsuguda?

The distance between Jharsuguda (Jharsuguda Airport) and Bangda (Qamdo Bamda Airport) is 1004 miles / 1616 kilometers / 873 nautical miles.

The driving distance from Jharsuguda (JRG) to Bangda (BPX) is 1609 miles / 2590 kilometers, and travel time by car is about 31 hours 3 minutes.

Jharsuguda Airport – Qamdo Bamda Airport

Distance: 1004 miles / 1616 kilometers / 873 nautical miles
Flight time: 2 h 24 min
CO2 emission: 151 kg


Distance from Jharsuguda to Bangda

There are several ways to calculate the distance from Jharsuguda to Bangda. Here are two standard methods:

Vincenty's formula (applied above)
  • 1004.172 miles
  • 1616.059 kilometers
  • 872.602 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
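
As a cross-check, the ellipsoidal distance can be reproduced with the geopy library, whose geodesic() routine solves the same inverse problem on the WGS-84 ellipsoid that Vincenty's formula targets. This is a sketch, not necessarily the exact method used above; the decimal coordinates are converted from the airport information further down the page.

```python
# Sketch: ellipsoidal (WGS-84) distance with geopy; not necessarily the
# exact implementation used by this page.
from geopy.distance import geodesic

jrg = (21.9133, 84.0503)   # Jharsuguda Airport, decimal degrees
bpx = (30.5533, 97.1081)   # Qamdo Bamda Airport, decimal degrees

d = geodesic(jrg, bpx)
print(round(d.miles, 1), round(d.km, 1), round(d.nautical, 1))  # ≈ 1004 mi / 1616 km / 873 nm
```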

Haversine formula
  • 1004.299 miles
  • 1616.263 kilometers
  • 872.712 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth (great-circle distance, the shortest path between two points along the Earth's surface).
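
For reference, here is a minimal Python sketch of the haversine calculation using the airport coordinates listed below. The mean Earth radius of 6371 km is an assumption; the constant used by this page may differ slightly.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth, in statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    c = 2 * math.asin(math.sqrt(a))
    return radius_km * c / 1.609344  # kilometers to statute miles

# JRG (21°54′48″N, 84°3′1″E) to BPX (30°33′12″N, 97°6′29″E)
print(haversine_miles(21.9133, 84.0503, 30.5533, 97.1081))  # ≈ 1004 miles
```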

How long does it take to fly from Jharsuguda to Bangda?

The estimated flight time from Jharsuguda Airport to Qamdo Bamda Airport is 2 hours and 24 minutes.
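
The page does not state how this estimate is derived. A common rule of thumb, shown below purely as an illustration, assumes an average cruise speed of about 500 mph plus roughly 30 minutes for takeoff, climb, and landing; both numbers are assumptions and give a figure close to, but not exactly, the estimate above.

```python
# Illustrative rule of thumb only; not the calculator's published formula.
distance_miles = 1004
cruise_mph = 500      # assumed average cruise speed
overhead_min = 30     # assumed allowance for taxi, climb, and descent

total_min = distance_miles / cruise_mph * 60 + overhead_min
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # about 2 h 30 min with these assumptions
```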

Flight carbon footprint between Jharsuguda Airport (JRG) and Qamdo Bamda Airport (BPX)

On average, flying from Jharsuguda to Bangda generates about 151 kg of CO2 per passenger (151 kilograms is equal to 333 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
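
The methodology behind this figure is not published here. The sketch below simply shows how a per-passenger estimate of this size can be reproduced from the flight distance using an assumed emission factor, and how the kilogram figure converts to pounds; the factor of 0.0934 kg CO2 per passenger-kilometre is an illustrative assumption, not the calculator's actual value.

```python
# Illustrative only: an assumed per-passenger emission factor, not this
# calculator's actual methodology.
distance_km = 1616
kg_co2_per_pax_km = 0.0934    # assumed factor for a short/medium-haul flight

kg = distance_km * kg_co2_per_pax_km
lbs = kg * 2.20462            # kilograms to pounds
print(round(kg), round(lbs))  # ≈ 151 kg, ≈ 333 lbs
```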

Map of flight path and driving directions from Jharsuguda to Bangda

See the map of the shortest flight path between Jharsuguda Airport (JRG) and Qamdo Bamda Airport (BPX).

Airport information

Origin Jharsuguda Airport
City: Jharsuguda
Country: India
IATA Code: JRG
ICAO Code: VEJH
Coordinates: 21°54′48″N, 84°3′1″E
Destination Qamdo Bamda Airport
City: Bangda
Country: China
IATA Code: BPX
ICAO Code: ZUBD
Coordinates: 30°33′12″N, 97°6′29″E
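
The coordinates above are given in degrees, minutes, and seconds; the decimal values used in the earlier code sketches come from a conversion like the one below (dms_to_decimal is a hypothetical helper name).

```python
def dms_to_decimal(degrees, minutes, seconds):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return degrees + minutes / 60 + seconds / 3600

print(dms_to_decimal(21, 54, 48))  # JRG latitude  ≈ 21.9133
print(dms_to_decimal(84, 3, 1))    # JRG longitude ≈ 84.0503
print(dms_to_decimal(30, 33, 12))  # BPX latitude  ≈ 30.5533
print(dms_to_decimal(97, 6, 29))   # BPX longitude ≈ 97.1081
```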