
How far is Yonago from Phuket?

The distance between Phuket (Phuket International Airport) and Yonago (Miho-Yonago Airport) is 2901 miles / 4669 kilometers / 2521 nautical miles.

The driving distance from Phuket (HKT) to Yonago (YGJ) is 4463 miles / 7182 kilometers, and travel time by car is about 88 hours 6 minutes.

Phuket International Airport – Miho-Yonago Airport

2901 miles / 4669 kilometers / 2521 nautical miles


Distance from Phuket to Yonago

There are several ways to calculate the distance from Phuket to Yonago. Here are two standard methods:

Vincenty's formula (applied above)
  • 2901.268 miles
  • 4669.137 kilometers
  • 2521.133 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
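
As a cross-check, an ellipsoidal (Vincenty-style) distance can be reproduced with a geodesic library. The sketch below uses geopy, which is not mentioned on this page and is only one convenient option; its geodesic() defaults to the WGS-84 ellipsoid, so the result should land very close to the 2901.268 miles quoted above.

```python
# pip install geopy  -- assumed helper library, not part of this page
from geopy.distance import geodesic

# Airport coordinates from the "Airport information" section, in decimal degrees
hkt = (8.113056, 98.316667)    # Phuket International Airport (HKT)
ygj = (35.491944, 133.235833)  # Miho-Yonago Airport (YGJ)

d = geodesic(hkt, ygj)         # ellipsoidal (WGS-84) distance
print(round(d.miles, 3), round(d.km, 3), round(d.nm, 3))
```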

Haversine formula
  • 2903.581 miles
  • 4672.861 kilometers
  • 2523.143 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
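
For the spherical figure, the haversine formula is short enough to write out directly. This is a minimal sketch assuming a mean Earth radius of 3958.8 miles; the coordinates are the decimal form of the values listed under "Airport information" below.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_miles * math.asin(math.sqrt(a))

print(round(haversine_miles(8.113056, 98.316667, 35.491944, 133.235833), 1))
# ≈ 2903.7 miles, in line with the haversine figure above
```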

How long does it take to fly from Phuket to Yonago?

The estimated flight time from Phuket International Airport to Miho-Yonago Airport is 5 hours and 59 minutes.
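
The page does not state how the flight time is derived. A common rough approach, sketched below, is distance divided by an assumed average speed plus a fixed allowance for taxi, climb and descent; the 500 mph and 30 minutes used here are illustrative assumptions and do not exactly reproduce the 5 hours 59 minutes quoted above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_minutes=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance."""
    total_minutes = overhead_minutes + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_minutes), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(2901.268))  # "6 h 18 min" with these assumed parameters
```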

Flight carbon footprint between Phuket International Airport (HKT) and Miho-Yonago Airport (YGJ)

On average, flying from Phuket to Yonago generates about 322 kg of CO2 per passenger (equivalent to 711 pounds). The figures are estimates and include only the CO2 generated by burning jet fuel.
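
The calculation behind the 322 kg figure is not shown. A very rough per-passenger sketch, under assumptions that are not from this page, multiplies an assumed fuel burn per passenger-kilometre by the standard factor of roughly 3.16 kg of CO2 per kg of jet fuel burned:

```python
FUEL_PER_PAX_KM = 0.022   # kg of jet fuel per passenger-km -- assumed, illustrative only
CO2_PER_KG_FUEL = 3.16    # kg of CO2 per kg of jet fuel burned
KG_TO_LB = 2.20462        # kilograms to pounds

distance_km = 4669.137
co2_kg = distance_km * FUEL_PER_PAX_KM * CO2_PER_KG_FUEL
print(round(co2_kg), "kg ≈", round(co2_kg * KG_TO_LB), "lbs")  # ~325 kg ≈ ~716 lbs
```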

Map of flight path and driving directions from Phuket to Yonago

See the map of the shortest flight path between Phuket International Airport (HKT) and Miho-Yonago Airport (YGJ).

Airport information

Origin Phuket International Airport
City: Phuket
Country: Thailand
IATA Code: HKT
ICAO Code: VTSP
Coordinates: 8°6′47″N, 98°19′0″E
Destination Miho-Yonago Airport
City: Yonago
Country: Japan
IATA Code: YGJ
ICAO Code: RJOH
Coordinates: 35°29′31″N, 133°14′9″E
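
The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier on the page expect decimal degrees. A small conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds plus hemisphere into signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

hkt = (dms_to_decimal(8, 6, 47, "N"), dms_to_decimal(98, 19, 0, "E"))
ygj = (dms_to_decimal(35, 29, 31, "N"), dms_to_decimal(133, 14, 9, "E"))
print(hkt)  # ≈ (8.1131, 98.3167)
print(ygj)  # ≈ (35.4919, 133.2358)
```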