How far is Palembang from Hat Yai?
The distance between Hat Yai (Hat Yai International Airport) and Palembang (Sultan Mahmud Badaruddin II International Airport) is 738 miles / 1188 kilometers / 641 nautical miles.
Hat Yai International Airport – Sultan Mahmud Badaruddin II International Airport
Distance from Hat Yai to Palembang
There are several ways to calculate the distance from Hat Yai to Palembang. Here are two standard methods:
Vincenty's formula (applied above)
- 738.078 miles
- 1187.821 kilometers
- 641.372 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
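If you want to reproduce the ellipsoidal figure yourself, the short Python sketch below implements the standard Vincenty inverse method on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport table at the bottom of this page, and the tolerance and iteration cap are arbitrary choices, so treat it as an illustration rather than this site's own calculator.

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; result in kilometers."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                                     + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                 * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# HDY and PLM in decimal degrees (converted from the DMS table below)
print(f"{vincenty_distance_km(6.9331, 100.3928, -2.8981, 104.6997):.1f} km")  # ≈ 1188 km
```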
Haversine formula
- 741.393 miles
- 1193.156 kilometers
- 644.253 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
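The spherical figure is even easier to check. The sketch below is a minimal haversine implementation using the conventional mean Earth radius of 6,371 km and the same decimal coordinates as above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# HDY (6°56′N, 100°24′E) to PLM (2°54′S, 104°42′E) in decimal degrees
km = haversine_km(6.9331, 100.3928, -2.8981, 104.6997)
print(f"{km:.1f} km ≈ {km / 1.609344:.1f} mi ≈ {km / 1.852:.1f} NM")  # ≈ 1193 km / 741 mi / 644 NM
```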
How long does it take to fly from Hat Yai to Palembang?
The estimated flight time from Hat Yai International Airport to Sultan Mahmud Badaruddin II International Airport is 1 hour and 53 minutes.
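The page does not say how this estimate is derived. A common convention is distance divided by an assumed average air speed plus a fixed allowance for takeoff and landing; the sketch below uses 500 mph and 30 minutes, both assumptions, so it lands near, not exactly on, the figure above.

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, taxi_allowance_min=30):
    """Rough block-time estimate: cruise time plus a fixed takeoff/landing allowance."""
    return distance_miles / cruise_mph * 60 + taxi_allowance_min

minutes = estimated_flight_minutes(738.078)
print(f"{int(minutes // 60)} h {int(minutes % 60)} min")  # ≈ 1 h 59 min with these assumed parameters
```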
What is the time difference between Hat Yai and Palembang?
There is no time difference between Hat Yai and Palembang: Thailand observes Indochina Time (UTC+7) and Palembang, in western Indonesia, observes Western Indonesia Time (WIB, UTC+7), so clocks in both cities show the same time.
Flight carbon footprint between Hat Yai International Airport (HDY) and Sultan Mahmud Badaruddin II International Airport (PLM)
On average, flying from Hat Yai to Palembang generates about 129 kg of CO2 per passenger, which is equivalent to 284 pounds (lbs). This figure is an estimate and includes only the CO2 generated by burning jet fuel.
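Both numbers can be sanity-checked from the figures already quoted: the pound value is a straight unit conversion, and dividing the total by the route distance gives the implied per-kilometre rate (a derived figure, not one published by the site).

```python
co2_kg = 129             # per-passenger estimate quoted above
distance_km = 1187.821   # ellipsoidal distance quoted above

print(f"{co2_kg * 2.20462:.0f} lb")                                  # ≈ 284 lb
print(f"{co2_kg / distance_km * 1000:.0f} g CO2 per passenger-km")   # ≈ 109 g/km
```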
Map of flight path from Hat Yai to Palembang
See the map of the shortest flight path between Hat Yai International Airport (HDY) and Sultan Mahmud Badaruddin II International Airport (PLM).
Airport information
| Origin | Hat Yai International Airport |
| --- | --- |
| City: | Hat Yai |
| Country: | Thailand |
| IATA Code: | HDY |
| ICAO Code: | VTSS |
| Coordinates: | 6°55′59″N, 100°23′34″E |

| Destination | Sultan Mahmud Badaruddin II International Airport |
| --- | --- |
| City: | Palembang |
| Country: | Indonesia |
| IATA Code: | PLM |
| ICAO Code: | WIPP |
| Coordinates: | 2°53′53″S, 104°41′59″E |
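The coordinates above are given in degrees, minutes and seconds. The decimal values used in the distance sketches earlier on this page come from the standard DMS conversion, sketched below (the helper and its argument format are illustrative, not part of any airport data source).

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# HDY: 6°55′59″N, 100°23′34″E    PLM: 2°53′53″S, 104°41′59″E
print(dms_to_decimal(6, 55, 59, "N"), dms_to_decimal(100, 23, 34, "E"))   # ≈ 6.9331, 100.3928
print(dms_to_decimal(2, 53, 53, "S"), dms_to_decimal(104, 41, 59, "E"))   # ≈ -2.8981, 104.6997
```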