
How far is Wuxi from Hua Hin?

The distance between Hua Hin (Hua Hin Airport) and Wuxi (Sunan Shuofang International Airport) is 1839 miles / 2960 kilometers / 1598 nautical miles.

The driving distance from Hua Hin (HHQ) to Wuxi (WUX) is 2371 miles / 3815 kilometers, and travel time by car is about 45 hours 4 minutes.

Hua Hin Airport – Sunan Shuofang International Airport

1839 miles
2960 kilometers
1598 nautical miles


Distance from Hua Hin to Wuxi

There are several ways to calculate the distance from Hua Hin to Wuxi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1839.200 miles
  • 2959.905 kilometers
  • 1598.221 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
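The page doesn't state which reference ellipsoid it uses; the sketch below implements the standard Vincenty inverse method on the WGS-84 ellipsoid (an assumption), with the two airports' coordinates converted to decimal degrees:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial-line special case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# HHQ (12°38′10″N, 99°57′5″E) to WUX (31°29′39″N, 120°25′44″E)
km = vincenty_distance(12.63611, 99.95139, 31.49417, 120.42889) / 1000
print(round(km, 1))  # close to the 2959.905 km listed above
```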

Haversine formula
  • 1841.503 miles
  • 2963.612 kilometers
  • 1600.222 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
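The haversine calculation is compact enough to show in full. The sketch below assumes a mean Earth radius of 6371 km (the page doesn't state the radius it uses) and the airport coordinates in decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# HHQ (12°38′10″N, 99°57′5″E) to WUX (31°29′39″N, 120°25′44″E)
d = haversine_km(12.63611, 99.95139, 31.49417, 120.42889)
print(round(d))  # → 2964, matching the ~2963.6 km figure above
```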

How long does it take to fly from Hua Hin to Wuxi?

The estimated flight time from Hua Hin Airport to Sunan Shuofang International Airport is 3 hours and 58 minutes.
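The page doesn't state how it derives flight time. A common rule of thumb is cruise time at roughly 500 mph plus about 30 minutes for taxi, takeoff, and landing; a sketch under that assumption (both figures are assumptions, not the site's model):

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: fixed overhead plus cruise at a fixed speed."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    return divmod(round(total_min), 60)  # (hours, minutes)

print(estimate_flight_time(1839))  # → (4, 11), in the ballpark of the 3 h 58 min above
```

The small gap versus the listed 3 hours 58 minutes suggests the site assumes a slightly faster cruise speed or lower overhead.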

Flight carbon footprint between Hua Hin Airport (HHQ) and Sunan Shuofang International Airport (WUX)

On average, flying from Hua Hin to Wuxi generates about 203 kg (448 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
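The kilogram-to-pound conversion above can be checked with the standard factor (1 lb = 0.453592 kg):

```python
KG_PER_LB = 0.453592  # international avoirdupois pound in kilograms

co2_kg = 203
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # → 448
```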

Map of flight path and driving directions from Hua Hin to Wuxi

See the map of the shortest flight path between Hua Hin Airport (HHQ) and Sunan Shuofang International Airport (WUX).

Airport information

Origin Hua Hin Airport
City: Hua Hin
Country: Thailand
IATA Code: HHQ
ICAO Code: VTPH
Coordinates: 12°38′10″N, 99°57′5″E
Destination Sunan Shuofang International Airport
City: Wuxi
Country: China
IATA Code: WUX
ICAO Code: ZSWX
Coordinates: 31°29′39″N, 120°25′44″E
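The coordinates above are given in degrees, minutes, and seconds; the distance formulas earlier need decimal degrees. A small (hypothetical) conversion helper:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus an N/S/E/W letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Hua Hin Airport: 12°38′10″N, 99°57′5″E
print(round(dms_to_decimal(12, 38, 10, "N"), 4))  # → 12.6361
print(round(dms_to_decimal(99, 57, 5, "E"), 4))   # → 99.9514
```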