How far is Huaihua from Banmaw?

The distance between Banmaw (Bhamo Airport) and Huaihua (Huaihua Zhijiang Airport) is 805 miles / 1296 kilometers / 700 nautical miles.

The driving distance from Banmaw (BMO) to Huaihua (HJJ) is 1064 miles / 1712 kilometers, and travel time by car is about 20 hours 9 minutes.

Bhamo Airport – Huaihua Zhijiang Airport

Distance: 805 miles / 1296 kilometers / 700 nautical miles
Flight time: 2 h 1 min
Time difference: 1 h 30 min
CO2 emission: 135 kg

Distance from Banmaw to Huaihua

There are several ways to calculate the distance from Banmaw to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 805.409 miles
  • 1296.180 kilometers
  • 699.881 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
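
As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. This is a standard textbook implementation, not this site's actual code; the function name, convergence tolerance, and iteration cap are illustrative choices.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty inverse problem)."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); cos2_alpha == 0 only for equatorial lines
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# BMO (24°16′8″N, 97°14′46″E) and HJJ (27°26′27″N, 109°42′0″E) in decimal degrees
m = vincenty_inverse(24.268889, 97.246111, 27.440833, 109.7)
# ≈ 805.409 mi / 1296.180 km / 699.881 NM, matching the Vincenty figures above
print(f"{m / 1609.344:.3f} mi, {m / 1000:.3f} km, {m / 1852:.3f} NM")
```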

Haversine formula
  • 804.320 miles
  • 1294.427 kilometers
  • 698.935 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
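
A minimal Python sketch of the haversine computation, assuming the commonly used 6371 km mean Earth radius (the exact radius this site uses is not stated):

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine(24.268889, 97.246111, 27.440833, 109.7)
# ≈ 804.320 mi / 1294.427 km / 698.935 NM, matching the haversine figures above
print(f"{km / 1.609344:.3f} mi, {km:.3f} km, {km / 1.852:.3f} NM")
```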

How long does it take to fly from Banmaw to Huaihua?

The estimated flight time from Bhamo Airport to Huaihua Zhijiang Airport is 2 hours and 1 minute.
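
The site does not publish its flight-time model. A common rule of thumb estimates block time as the great-circle distance flown at a typical cruise speed of about 500 mph, plus roughly 30 minutes for takeoff and landing; both parameters in the sketch below are assumptions, and they land a few minutes above the 2 h 1 min quoted here.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb block time: cruise segment plus fixed takeoff/landing overhead."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(805.409))  # → "2 h 7 min" with these assumed parameters
```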

Flight carbon footprint between Bhamo Airport (BMO) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Banmaw to Huaihua generates about 135 kg of CO2 per passenger, which is roughly 298 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
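
For illustration, the per-passenger figure can be modeled as distance times an emission factor. The factor below (about 0.168 kg CO2 per passenger-mile) is simply back-calculated from this page's own numbers, not an official coefficient:

```python
KG_PER_PASSENGER_MILE = 135 / 805.409   # back-calculated from this page's figures
LBS_PER_KG = 2.20462

def co2_estimate_kg(distance_miles, factor=KG_PER_PASSENGER_MILE):
    """Approximate per-passenger CO2 from jet-fuel burn, in kilograms."""
    return distance_miles * factor

kg = co2_estimate_kg(805.409)
print(f"{kg:.0f} kg CO2 ≈ {kg * LBS_PER_KG:.0f} lbs")  # 135 kg ≈ 298 lbs
```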

Map of flight path and driving directions from Banmaw to Huaihua

See the map of the shortest flight path between Bhamo Airport (BMO) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin: Bhamo Airport
City: Banmaw
Country: Burma
IATA Code: BMO
ICAO Code: VYBM
Coordinates: 24°16′8″N, 97°14′46″E
Destination: Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E
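
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas expect decimal degrees. A small hypothetical helper for the conversion, reproducing the values used in the sketches above:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Bhamo Airport: 24°16′8″N, 97°14′46″E
print(f"BMO: {dms_to_decimal(24, 16, 8, 'N'):.6f}, {dms_to_decimal(97, 14, 46, 'E'):.6f}")
# Huaihua Zhijiang Airport: 27°26′27″N, 109°42′0″E
print(f"HJJ: {dms_to_decimal(27, 26, 27, 'N'):.6f}, {dms_to_decimal(109, 42, 0, 'E'):.6f}")
```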