
How far is Huaihua from Dung Quat Bay?

The distance between Dung Quat Bay (Chu Lai Airport) and Huaihua (Huaihua Zhijiang Airport) is 831 miles / 1337 kilometers / 722 nautical miles.

The driving distance from Dung Quat Bay (VCL) to Huaihua (HJJ) is 1176 miles / 1893 kilometers, and travel time by car is about 22 hours 5 minutes.

Chu Lai Airport – Huaihua Zhijiang Airport

831 miles / 1337 kilometers / 722 nautical miles


Distance from Dung Quat Bay to Huaihua

There are several ways to calculate the distance from Dung Quat Bay to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 830.680 miles
  • 1336.850 kilometers
  • 721.841 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
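Vincenty's inverse formula is iterative: it repeatedly refines the difference in longitude on an auxiliary sphere until it converges. The sketch below is a standard transcription of that formula using the WGS-84 ellipsoid constants; the function name and coordinate tuples are mine, and the airport coordinates come from the table at the end of this page. It reproduces the ~831-mile figure quoted above to within rounding:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0               # semi-major axis, meters
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial line: cos2Alpha = 0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
            * (-3 + 4 * cos2SigmaM ** 2)))
    meters = b * A * (sigma - deltaSigma)
    return meters / 1609.344  # meters to statute miles

# VCL and HJJ coordinates converted to decimal degrees
print(round(vincenty_miles(15.403056, 108.705833, 27.440833, 109.7), 1))
```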

Haversine formula
  • 834.174 miles
  • 1342.473 kilometers
  • 724.877 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth. This gives the great-circle distance, the shortest path between two points along the surface of the sphere.
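The haversine calculation is short enough to write out in full. A minimal sketch, assuming a mean earth radius of 6371 km (the exact radius used by the calculator above may differ slightly) and using the airport coordinates from the table at the end of this page:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere of the given mean radius.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Airport coordinates converted from the DMS values listed below
VCL = (15 + 24 / 60 + 11 / 3600, 108 + 42 / 60 + 21 / 3600)  # Chu Lai
HJJ = (27 + 26 / 60 + 27 / 3600, 109 + 42 / 60 + 0 / 3600)   # Huaihua Zhijiang

print(round(haversine_km(*VCL, *HJJ), 1))  # ~1342 km, matching the figure above
```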

How long does it take to fly from Dung Quat Bay to Huaihua?

The estimated flight time from Chu Lai Airport to Huaihua Zhijiang Airport is 2 hours and 4 minutes.
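Flight-time estimates like this are typically distance at an average cruise speed plus a fixed allowance for taxi, climb, and descent. The sketch below assumes a 500 mph cruise speed and a 30-minute allowance; these parameters are my assumptions, not the calculator's published method, so the result lands near but not exactly on the 2 hours 4 minutes quoted above:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Cruise time plus a fixed allowance for taxi, climb and descent.
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = estimated_flight_minutes(831)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # 2 h 10 min
```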

Flight carbon footprint between Chu Lai Airport (VCL) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Dung Quat Bay to Huaihua generates about 138 kg of CO2 per passenger, which is roughly 304 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
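A per-passenger figure like this is consistent with a simple per-mile emission factor. A minimal sketch, assuming roughly 0.166 kg of CO2 per passenger-mile (an assumed average for a narrow-body jet, not the calculator's published methodology), plus the kilogram-to-pound conversion:

```python
CO2_KG_PER_PASSENGER_MILE = 0.166  # assumed factor, not the site's method
KG_PER_POUND = 0.45359237          # exact definition of the pound

def co2_estimate_kg(distance_miles):
    return distance_miles * CO2_KG_PER_PASSENGER_MILE

kg = co2_estimate_kg(831)
print(round(kg), "kg =", round(kg / KG_PER_POUND), "lbs")  # 138 kg = 304 lbs
```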

Map of flight path and driving directions from Dung Quat Bay to Huaihua

See the map of the shortest flight path between Chu Lai Airport (VCL) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin: Chu Lai Airport
City: Dung Quat Bay
Country: Vietnam
IATA Code: VCL
ICAO Code: VVCA
Coordinates: 15°24′11″N, 108°42′21″E
Destination: Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E