
How far is Mengnai from Hat Yai?

The distance between Hat Yai (Hat Yai International Airport) and Mengnai (Huatugou Airport) is 2234 miles / 3595 kilometers / 1941 nautical miles.

The driving distance from Hat Yai (HDY) to Mengnai (HTT) is 3261 miles / 5248 kilometers, and travel time by car is about 63 hours 59 minutes.

Hat Yai International Airport – Huatugou Airport
  • 2234 miles
  • 3595 kilometers
  • 1941 nautical miles


Distance from Hat Yai to Mengnai

There are several ways to calculate the distance from Hat Yai to Mengnai. Here are two standard methods:

Vincenty's formula (applied above)
  • 2233.651 miles
  • 3594.714 kilometers
  • 1940.990 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
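As a rough cross-check, the ellipsoidal figure can be reproduced with the pyproj library, whose Geod class solves the same inverse problem on the WGS-84 ellipsoid (pyproj uses Karney's algorithm rather than Vincenty's iteration, but the two agree to well under a metre at this range). The decimal coordinates below are rounded conversions of the airport coordinates listed at the bottom of the page.

```python
from pyproj import Geod

# Airport coordinates in decimal degrees, converted from the airport
# information at the bottom of the page.
HDY_LAT, HDY_LON = 6.9331, 100.3928    # Hat Yai International Airport
HTT_LAT, HTT_LON = 38.2019, 90.8414    # Huatugou Airport

geod = Geod(ellps="WGS84")  # ellipsoidal Earth model

# inv() expects lon/lat order and returns forward azimuth, back azimuth,
# and the geodesic distance in metres.
_, _, metres = geod.inv(HDY_LON, HDY_LAT, HTT_LON, HTT_LAT)

print(f"{metres / 1000:.1f} km")            # close to the 3594.7 km above
print(f"{metres / 1609.344:.1f} miles")     # close to the 2233.7 mi above
print(f"{metres / 1852:.1f} nautical miles")
```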

Haversine formula
  • 2241.585 miles
  • 3607.481 kilometers
  • 1947.884 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the sphere's surface).
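The great-circle figure can be checked with a few lines of standard-library Python. The mean Earth radius of 6371 km below is an assumption; the site does not state which radius it uses, so the result may differ from the quoted value by a few kilometres.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(6.9331, 100.3928, 38.2019, 90.8414)  # HDY -> HTT
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} NM")
# roughly 3608 km with this radius, close to the 3607.5 km quoted above
```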

How long does it take to fly from Hat Yai to Mengnai?

The estimated flight time from Hat Yai International Airport to Huatugou Airport is 4 hours and 43 minutes.
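Flight-time estimates of this kind are usually derived from the great-circle distance, an assumed average speed, and a fixed allowance for taxi, climb, and descent. The speed and allowance in the sketch below are illustrative assumptions, not the calculator's published model, so its output only approximates the 4 hours 43 minutes quoted above.

```python
def estimate_flight_time(distance_miles, avg_speed_mph=500, allowance_min=30):
    """Rough block-time estimate: cruise time plus a fixed allowance.

    Both parameters are assumptions; the calculator's exact model is not stated.
    """
    total_min = distance_miles / avg_speed_mph * 60 + allowance_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(2234))  # about 4 h 58 min with these assumptions
```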

Flight carbon footprint between Hat Yai International Airport (HDY) and Huatugou Airport (HTT)

On average, flying from Hat Yai to Mengnai generates about 244 kg of CO2 per passenger, which is equivalent to 539 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Hat Yai to Mengnai

See the map of the shortest flight path between Hat Yai International Airport (HDY) and Huatugou Airport (HTT).

Airport information

Origin: Hat Yai International Airport
City: Hat Yai
Country: Thailand
IATA Code: HDY
ICAO Code: VTSS
Coordinates: 6°55′59″N, 100°23′34″E

Destination: Huatugou Airport
City: Mengnai
Country: China
IATA Code: HTT
ICAO Code: ZLHX
Coordinates: 38°12′7″N, 90°50′29″E