
How far is Huaihua from Dawei?

The distance between Dawei (Dawei Airport) and Huaihua (Huaihua Zhijiang Airport) is 1180 miles / 1898 kilometers / 1025 nautical miles.

The driving distance from Dawei (TVY) to Huaihua (HJJ) is 1680 miles / 2703 kilometers, and travel time by car is about 33 hours 10 minutes.

Dawei Airport – Huaihua Zhijiang Airport

Distance: 1180 miles / 1898 kilometers / 1025 nautical miles
Flight time: 2 h 44 min
Time difference: 1 h 30 min
CO2 emission: 161 kg


Distance from Dawei to Huaihua

There are several ways to calculate the distance from Dawei to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 1179.573 miles
  • 1898.338 kilometers
  • 1025.021 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
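For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, convergence tolerance, and iteration cap are this sketch's own choices, not part of the calculator; fed the airport coordinates from the "Airport information" section below, it should land within a fraction of a mile of the value quoted above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Distance in statute miles via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0             # semi-major axis (metres)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (metres)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):      # iterate longitude on the auxiliary sphere
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # metres -> miles

# TVY -> HJJ, coordinates from the "Airport information" section below
print(vincenty_miles(14.10389, 98.20333, 27.44083, 109.70000))  # ~1179.6
```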

Haversine formula
  • 1181.934 miles
  • 1902.138 kilometers
  • 1027.073 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
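The haversine computation is short enough to sketch in full. The only free parameter is the Earth radius; a mean radius of 6371 km (about 3958.8 miles) is assumed here, which should land within a mile or so of the figures above.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles on a spherical Earth."""
    R = 3958.8  # mean Earth radius in miles (assumed; ~6371 km)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

print(haversine_miles(14.10389, 98.20333, 27.44083, 109.70000))  # ~1182
```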

How long does it take to fly from Dawei to Huaihua?

The estimated flight time from Dawei Airport to Huaihua Zhijiang Airport is 2 hours and 44 minutes.
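A common rule of thumb estimates flight time as the cruise leg plus a fixed allowance for taxi, climb, and descent. The calculator's exact constants aren't published, so the 500 mph cruise speed and 30-minute overhead below are assumptions; with them, the estimate comes out within roughly ten minutes of the 2 hours 44 minutes quoted above.

```python
def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Back-of-the-envelope flight time: cruise leg plus a fixed
    taxi/climb/descent overhead (both constants are assumptions)."""
    total = overhead_min + distance_miles / cruise_mph * 60
    h, m = divmod(round(total), 60)
    return f"{h} h {m} min"

print(flight_time(1180))  # ~2 h 52 min with these assumed constants
```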

Flight carbon footprint between Dawei Airport (TVY) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Dawei to Huaihua generates about 161 kg of CO2 per passenger; 161 kilograms is roughly 355 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
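The unit conversion and the implied per-mile rate are easy to check. The per-mile figure below is simply the quoted total divided by the route distance, an observation rather than a published emission factor.

```python
KG_PER_LB = 0.45359237            # exact definition of the pound

co2_kg = 161
print(round(co2_kg / KG_PER_LB))  # 355 lb

# Implied per-passenger rate on this route
print(round(co2_kg / 1180, 3))    # ~0.136 kg CO2 per mile
```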

Map of flight path and driving directions from Dawei to Huaihua

See the map of the shortest flight path between Dawei Airport (TVY) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin: Dawei Airport
City: Dawei
Country: Burma
IATA Code: TVY
ICAO Code: VYDW
Coordinates: 14°6′14″N, 98°12′12″E
Destination: Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E
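The coordinates above are given in degrees/minutes/seconds, while the distance formulas expect decimal degrees. A small helper (hypothetical, for illustration) converts between the two:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

tvy = (dms_to_decimal(14, 6, 14, "N"), dms_to_decimal(98, 12, 12, "E"))
hjj = (dms_to_decimal(27, 26, 27, "N"), dms_to_decimal(109, 42, 0, "E"))
print(tvy)  # ~(14.1039, 98.2033)
print(hjj)  # ~(27.4408, 109.7000)
```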