
How far is Jiayuguan from Phuket?

The distance between Phuket (Phuket International Airport) and Jiayuguan (Jiayuguan Airport) is 2185 miles / 3516 kilometers / 1899 nautical miles.

The driving distance from Phuket (HKT) to Jiayuguan (JGN) is 3057 miles / 4920 kilometers, and travel time by car is about 58 hours 31 minutes.

Phuket International Airport – Jiayuguan Airport

Distance: 2185 miles / 3516 kilometers / 1899 nautical miles


Distance from Phuket to Jiayuguan

There are several ways to calculate the distance from Phuket to Jiayuguan. Here are two standard methods:

Vincenty's formula (applied above)
  • 2185.033 miles
  • 3516.470 kilometers
  • 1898.742 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
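As a quick way to check the ellipsoidal figure, the sketch below uses the geopy library (an assumed choice for illustration, not something this page states it uses). geopy's geodesic() routine works on the WGS-84 ellipsoid, so it should land close to the Vincenty numbers above. The decimal coordinates come from the airport information listed further down.

```python
# Minimal sketch, assuming the geopy library is installed (pip install geopy).
# geodesic() uses an ellipsoidal WGS-84 earth model, serving the same purpose
# as Vincenty's formula. Coordinates are the airport positions below,
# converted to decimal degrees.
from geopy.distance import geodesic

hkt = (8.11306, 98.31667)    # Phuket International Airport (HKT)
jgn = (39.85667, 98.34139)   # Jiayuguan Airport (JGN)

d = geodesic(hkt, jgn)
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} nm")
# Expect values close to the Vincenty figures above (~2185 mi / ~3516 km).
```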

Haversine formula
  • 2193.281 miles
  • 3529.744 kilometers
  • 1905.909 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
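For reference, the haversine formula is short enough to write out directly. This is a minimal sketch using the same airport coordinates as above and an assumed mean earth radius of 3958.8 miles, so the result only approximates the figure quoted on this page.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance on a spherical earth (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_mi * math.asin(math.sqrt(a))

# HKT and JGN coordinates in decimal degrees (from the airport info below)
print(round(haversine_miles(8.11306, 98.31667, 39.85667, 98.34139), 1))
# Prints roughly 2193, close to the haversine figure above.
```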

How long does it take to fly from Phuket to Jiayuguan?

The estimated flight time from Phuket International Airport to Jiayuguan Airport is 4 hours and 38 minutes.
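The page does not publish the assumptions behind this estimate. A common back-of-the-envelope approach divides the great-circle distance by a typical cruise speed and adds a fixed allowance for taxi, climb, and descent; the speed and buffer below are assumed values for illustration, so the result only approximates the 4 hours 38 minutes shown above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus a fixed taxi/climb/descent buffer.
    cruise_mph and overhead_min are assumed values, not this site's actual parameters."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(2185))  # about 4 h 52 min under these assumptions
```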

Flight carbon footprint between Phuket International Airport (HKT) and Jiayuguan Airport (JGN)

On average, flying from Phuket to Jiayuguan generates about 239 kg (526 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
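The unit conversion behind the pounds figure is simple arithmetic: one kilogram is about 2.20462 pounds.

```python
KG_TO_LBS = 2.20462  # pounds per kilogram

co2_kg = 239
print(f"{co2_kg} kg ≈ {co2_kg * KG_TO_LBS:.1f} lbs")  # 239 kg ≈ 526.9 lbs (the page rounds to 526)
```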

Map of flight path and driving directions from Phuket to Jiayuguan

See the map of the shortest flight path between Phuket International Airport (HKT) and Jiayuguan Airport (JGN).

Airport information

Origin: Phuket International Airport
City: Phuket
Country: Thailand
IATA Code: HKT
ICAO Code: VTSP
Coordinates: 8°6′47″N, 98°19′0″E
Destination: Jiayuguan Airport
City: Jiayuguan
Country: China
IATA Code: JGN
ICAO Code: ZLJQ
Coordinates: 39°51′24″N, 98°20′29″E
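The coordinates above are given in degrees, minutes, and seconds. The decimal values used in the distance sketches earlier come from the standard conversion (degrees + minutes/60 + seconds/3600), sketched below.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds to decimal degrees (negative for S/W)."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(round(dms_to_decimal(8, 6, 47, "N"), 5), round(dms_to_decimal(98, 19, 0, "E"), 5))    # HKT: 8.11306, 98.31667
print(round(dms_to_decimal(39, 51, 24, "N"), 5), round(dms_to_decimal(98, 20, 29, "E"), 5)) # JGN: 39.85667, 98.34139
```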