How far is Wuzhou from Dawei?

The distance between Dawei (Dawei Airport) and Wuzhou (Wuzhou Changzhoudao Airport) is 1068 miles / 1719 kilometers / 928 nautical miles.

The driving distance from Dawei (TVY) to Wuzhou (WUZ) is 1501 miles / 2415 kilometers, and travel time by car is about 31 hours 19 minutes.

Dawei Airport – Wuzhou Changzhoudao Airport

Distance: 1068 miles / 1719 kilometers / 928 nautical miles
Flight time: 2 h 31 min
Time difference: 1 h 30 min
CO2 emission: 155 kg

Distance from Dawei to Wuzhou

There are several ways to calculate the distance from Dawei to Wuzhou. Here are two standard methods:

Vincenty's formula (applied above)
  • 1068.441 miles
  • 1719.489 kilometers
  • 928.450 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
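
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the decimal-degree equivalents of the airport positions listed under Airport information below; the iteration limit and convergence tolerance are illustrative choices, not taken from this site's implementation.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Distance in metres between two points on the WGS-84 ellipsoid
    (Vincenty's inverse formula, iterative)."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # points on the equator
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Airport coordinates from the Airport information section, in decimal degrees
dawei = (14.103889, 98.203333)     # TVY: 14°6′14″N, 98°12′12″E
wuzhou = (23.456667, 111.247778)   # WUZ: 23°27′24″N, 111°14′52″E
print(vincenty_distance_m(*dawei, *wuzhou) / 1000)  # roughly 1719 km, as above
```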

Haversine formula
  • 1069.207 miles
  • 1720.722 kilometers
  • 929.116 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
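
The spherical calculation is much shorter. The sketch below assumes a mean Earth radius of 6371 km (a common convention; the site does not state which radius it uses) and lands within about a kilometre of the haversine figure quoted above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Dawei (TVY) and Wuzhou (WUZ) in decimal degrees
print(haversine_km(14.103889, 98.203333, 23.456667, 111.247778))  # ~1720.8 km
```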

How long does it take to fly from Dawei to Wuzhou?

The estimated flight time from Dawei Airport to Wuzhou Changzhoudao Airport is 2 hours and 31 minutes.
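
The site does not publish its flight-time model. Estimates like this are typically a cruise-time calculation plus a fixed allowance for taxi, climb and descent; the sketch below uses an assumed cruise speed of 500 mph and a 30-minute overhead (both round-number assumptions) and comes out within a few minutes of the quoted 2 hours and 31 minutes.

```python
def rough_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Illustrative flight-time estimate: cruise time plus a fixed overhead.
    The speed and overhead are assumptions, not this site's actual model."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m} min"

print(rough_flight_time(1068.441))  # ~2 h 38 min, close to the 2 h 31 min above
```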

Flight carbon footprint between Dawei Airport (TVY) and Wuzhou Changzhoudao Airport (WUZ)

On average, flying from Dawei to Wuzhou generates about 155 kg (342 lbs) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
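
The pound figure is a straight unit conversion, and the 155 kg estimate works out to roughly 90 g of CO2 per passenger-kilometre over the 1,719 km great-circle distance. A short sketch of that arithmetic (the per-kilometre intensity is inferred from the numbers above, not a published factor):

```python
KG_PER_LB = 0.45359237            # exact kilograms-per-pound definition

co2_kg = 155                      # per-passenger estimate quoted above
print(round(co2_kg / KG_PER_LB))  # 342 lbs

# Implied emission intensity over the 1719.489 km great-circle distance
print(round(co2_kg / 1719.489 * 1000))  # ~90 g CO2 per passenger-km
```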

Map of flight path and driving directions from Dawei to Wuzhou

See the map of the shortest flight path between Dawei Airport (TVY) and Wuzhou Changzhoudao Airport (WUZ).

Airport information

Origin: Dawei Airport
City: Dawei
Country: Burma
IATA Code: TVY
ICAO Code: VYDW
Coordinates: 14°6′14″N, 98°12′12″E

Destination: Wuzhou Changzhoudao Airport
City: Wuzhou
Country: China
IATA Code: WUZ
ICAO Code: ZGWZ
Coordinates: 23°27′24″N, 111°14′52″E
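
The coordinates above are given in degrees, minutes and seconds, while the distance sketches earlier on this page use decimal degrees. The conversion is simply degrees + minutes/60 + seconds/3600:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(14, 6, 14, "N"), dms_to_decimal(98, 12, 12, "E"))    # TVY ≈ 14.1039, 98.2033
print(dms_to_decimal(23, 27, 24, "N"), dms_to_decimal(111, 14, 52, "E"))  # WUZ ≈ 23.4567, 111.2478
```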