
How far is Minggang from Bhuj?

The distance between Bhuj (Bhuj Airport) and Minggang (Xinyang Minggang Airport) is 2770 miles / 4457 kilometers / 2407 nautical miles.

The driving distance from Bhuj (BHJ) to Minggang (XAI) is 3826 miles / 6158 kilometers, and travel time by car is about 73 hours 18 minutes.

Bhuj Airport – Xinyang Minggang Airport

Distance: 2770 miles / 4457 kilometers / 2407 nautical miles
Flight time: 5 h 44 min
Time difference: 2 h 30 min
CO2 emission: 307 kg


Distance from Bhuj to Minggang

There are several ways to calculate the distance from Bhuj to Minggang. Here are two standard methods:

Vincenty's formula (applied above)
  • 2769.526 miles
  • 4457.120 kilometers
  • 2406.652 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
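For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method using the standard WGS-84 constants. The iteration cap and convergence tolerance are illustrative choices, not necessarily what this calculator uses:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # distance in metres

# BHJ -> XAI, using the coordinates listed under "Airport information"
m = vincenty_distance(23.287778, 69.67, 32.540556, 114.078889)
print(f"{m/1609.344:.3f} mi  {m/1000:.3f} km  {m/1852:.3f} NM")
# should land close to the 2769.526 mi / 4457.120 km quoted above
```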

Haversine formula
  • 2765.173 miles
  • 4450.114 kilometers
  • 2402.870 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface).
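The haversine version is much shorter. A minimal sketch follows; the 6371 km mean earth radius is a common convention and appears consistent with the figures above:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # great-circle distance on a sphere of the given radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# BHJ -> XAI with the same coordinates as above
print(haversine_distance(23.287778, 69.67, 32.540556, 114.078889))
# ~4450 km, matching the haversine figure quoted above
```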

How long does it take to fly from Bhuj to Minggang?

The estimated flight time from Bhuj Airport to Xinyang Minggang Airport is 5 hours and 44 minutes.
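The page does not state its flight-time formula. As a rough sketch, the time can be backed out of the straight-line distance and an assumed effective block speed; about 483 mph (an assumption, covering taxi, climb, cruise and descent) reproduces the quoted figure:

```python
def estimate_flight_time(distance_miles, avg_block_mph=483):
    # avg_block_mph is a hypothetical effective speed, chosen here
    # so that the result matches the 5 h 44 min quoted above
    h, m = divmod(round(distance_miles / avg_block_mph * 60), 60)
    return f"{h} h {m} min"

print(estimate_flight_time(2770))  # -> "5 h 44 min"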

Flight carbon footprint between Bhuj Airport (BHJ) and Xinyang Minggang Airport (XAI)

On average, flying from Bhuj to Minggang generates about 307 kg of CO2 per passenger; 307 kilograms is equal to 676 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
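A small worked conversion ties the numbers together; the 2.20462 lbs/kg factor is standard, and the per-mile rate is simply the page's figures divided out:

```python
distance_miles = 2770
co2_kg = 307                            # per-passenger estimate from this page
kg_per_mile = co2_kg / distance_miles   # ~0.111 kg of CO2 per mile flown
co2_lbs = co2_kg * 2.20462              # kilograms -> pounds
print(f"{kg_per_mile:.3f} kg/mile, {co2_lbs:.1f} lbs")
# ~0.111 kg/mile, 676.8 lbs (rounded to 676 above)
```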

Map of flight path and driving directions from Bhuj to Minggang

See the map of the shortest flight path between Bhuj Airport (BHJ) and Xinyang Minggang Airport (XAI).

Airport information

Origin: Bhuj Airport
City: Bhuj
Country: India
IATA Code: BHJ
ICAO Code: VABJ
Coordinates: 23°17′16″N, 69°40′12″E

Destination: Xinyang Minggang Airport
City: Minggang
Country: China
IATA Code: XAI
ICAO Code: ZHXY
Coordinates: 32°32′26″N, 114°4′44″E