
How far is Bhuj from Phuket?

The distance between Phuket (Phuket International Airport) and Bhuj (Bhuj Airport) is 2167 miles / 3488 kilometers / 1883 nautical miles.

The driving distance from Phuket (HKT) to Bhuj (BHJ) is 3582 miles / 5764 kilometers, and travel time by car is about 71 hours 38 minutes.

Phuket International Airport – Bhuj Airport

  • Distance: 2167 miles / 3488 kilometers / 1883 nautical miles
  • Flight time: 4 h 36 min
  • Time difference: 1 h 30 min
  • CO2 emission: 237 kg


Distance from Phuket to Bhuj

There are several ways to calculate the distance from Phuket to Bhuj. Here are two standard methods:

Vincenty's formula (applied above)
  • 2167.325 miles
  • 3487.971 kilometers
  • 1883.354 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
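As a sketch of how such a figure can be computed, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and tolerance are my own choices; the constants are the standard WGS-84 parameters.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in meters between two points on the WGS-84 ellipsoid,
    computed with Vincenty's inverse formula."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# HKT (8°6′47″N, 98°19′0″E) to BHJ (23°17′16″N, 69°40′12″E)
d_m = vincenty_inverse(8.113056, 98.316667, 23.287778, 69.670000)
print(round(d_m / 1000, 1))   # ≈ 3488 km, matching the figure above
```

The iteration on λ converges quickly for airport pairs like this one; the formula can fail to converge only for nearly antipodal points.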

Haversine formula
  • 2167.465 miles
  • 3488.197 kilometers
  • 1883.475 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
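The haversine calculation is short enough to show in full. This sketch uses a mean Earth radius of 6371 km (the exact radius chosen shifts the result by a kilometer or so):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# HKT to BHJ in decimal degrees
print(round(haversine_km(8.113056, 98.316667, 23.287778, 69.670000)))  # → 3488
```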

How long does it take to fly from Phuket to Bhuj?

The estimated flight time from Phuket International Airport to Bhuj Airport is 4 hours and 36 minutes.
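One simple way to produce such an estimate is to divide the great-circle distance by an assumed average block speed. The 471 mph used below is my assumption, chosen to reproduce the quoted 4 h 36 min; real estimators often use a cruise speed plus a fixed taxi/climb buffer instead.

```python
def flight_time(distance_miles, avg_speed_mph=471):
    """Rough flight-time estimate as (hours, minutes).
    avg_speed_mph is an assumed average block speed, not airline data."""
    total_min = round(distance_miles / avg_speed_mph * 60)
    return divmod(total_min, 60)

h, m = flight_time(2167)
print(f"{h} h {m} min")  # → 4 h 36 min
```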

Flight carbon footprint between Phuket International Airport (HKT) and Bhuj Airport (BHJ)

On average, flying from Phuket to Bhuj generates about 237 kg of CO2 per passenger, which is roughly 522 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
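The kilogram-to-pound conversion above follows from the exact definition of the pound:

```python
KG_PER_LB = 0.45359237  # exact: one avoirdupois pound in kilograms

def kg_to_lb(kg):
    """Convert a mass in kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(237)))  # → 522
```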


Airport information

Origin Phuket International Airport
City: Phuket
Country: Thailand
IATA Code: HKT
ICAO Code: VTSP
Coordinates: 8°6′47″N, 98°19′0″E
Destination Bhuj Airport
City: Bhuj
Country: India
IATA Code: BHJ
ICAO Code: VABJ
Coordinates: 23°17′16″N, 69°40′12″E
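The coordinates above are given in degrees, minutes, and seconds; distance formulas need decimal degrees. A small conversion helper (the function name is my own):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds plus hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(round(dms_to_decimal(8, 6, 47, "N"), 6))   # HKT latitude  → 8.113056
print(round(dms_to_decimal(98, 19, 0, "E"), 6))  # HKT longitude → 98.316667
```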