
How far is Bhuj from Manzhouli?

The distance between Manzhouli (Manzhouli Xijiao Airport) and Bhuj (Bhuj Airport) is 3139 miles / 5052 kilometers / 2728 nautical miles.

The driving distance from Manzhouli (NZH) to Bhuj (BHJ) is 4854 miles / 7811 kilometers, and travel time by car is about 92 hours 46 minutes.

Manzhouli Xijiao Airport – Bhuj Airport

Distance: 3139 miles / 5052 kilometers / 2728 nautical miles
Flight time: 6 h 26 min
Time difference: 2 h 30 min
CO2 emission: 351 kg

Distance from Manzhouli to Bhuj

There are several ways to calculate the distance from Manzhouli to Bhuj. Here are two standard methods:

Vincenty's formula (applied above)
  • 3139.425 miles
  • 5052.416 kilometers
  • 2728.086 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
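One way to sanity-check the ellipsoidal figure above is with the geopy library. This is a hedged sketch, not necessarily what this page uses: geopy's geodesic() relies on Karney's algorithm rather than Vincenty's original iteration, but both model the Earth as the WGS-84 ellipsoid and agree to well under a mile on a route of this length.

```python
# Sketch of an ellipsoidal distance check with geopy (assumed library choice).
from geopy.distance import geodesic

nzh = (49.5667, 117.3300)   # Manzhouli Xijiao: 49°34′0″N, 117°19′48″E
bhj = (23.2878, 69.6700)    # Bhuj: 23°17′16″N, 69°40′12″E

d = geodesic(nzh, bhj)
print(f"{d.miles:.1f} mi / {d.kilometers:.1f} km / {d.nautical:.1f} nmi")
# Expect values close to the 3139 mi / 5052 km / 2728 nmi quoted above.
```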

Haversine formula
  • 3136.779 miles
  • 5048.156 kilometers
  • 2725.786 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
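For comparison, here is a minimal Python sketch of the haversine calculation using the airport coordinates listed below. The Earth radius (3958.8 miles) is an assumption; small changes to it shift the result by a few miles, which is why it lands near but not exactly on the figure above.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two points given in decimal degrees."""
    phi1, lam1, phi2, lam2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((phi2 - phi1) / 2) ** 2 + cos(phi1) * cos(phi2) * sin((lam2 - lam1) / 2) ** 2
    return 2 * radius_miles * asin(sqrt(a))

# Airport coordinates from the table below, converted to decimal degrees
nzh = (49.5667, 117.3300)   # Manzhouli Xijiao: 49°34′0″N, 117°19′48″E
bhj = (23.2878, 69.6700)    # Bhuj: 23°17′16″N, 69°40′12″E

print(f"{haversine_miles(*nzh, *bhj):.1f} miles")   # ~3137 miles, close to the figure above
```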

How long does it take to fly from Manzhouli to Bhuj?

The estimated flight time from Manzhouli Xijiao Airport to Bhuj Airport is 6 hours and 26 minutes.
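A common rule of thumb, not necessarily the formula behind the figure above, estimates flight time as a fixed taxi-and-climb allowance plus great-circle distance divided by an average cruise speed. The sketch below assumes 500 mph and a 30-minute allowance, so it comes out close to but not exactly at 6 hours 26 minutes.

```python
# Rough flight-time estimate (assumed speed and overhead, not this site's exact formula).
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    minutes = overhead_min + distance_miles / cruise_mph * 60
    hours, mins = divmod(round(minutes), 60)
    return f"{hours} h {mins} min"

print(estimate_flight_time(3139))   # about 6 h 47 min with these assumptions
```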

Flight carbon footprint between Manzhouli Xijiao Airport (NZH) and Bhuj Airport (BHJ)

On average, flying from Manzhouli to Bhuj generates about 351 kg of CO2 per passenger; 351 kilograms is equivalent to 774 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
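The pounds figure follows from the standard conversion factor of roughly 2.20462 lbs per kilogram:

```python
# Kilograms-to-pounds conversion behind the 774 lbs figure.
co2_kg = 351
co2_lbs = co2_kg * 2.20462
print(round(co2_lbs))   # 774
```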

Map of flight path and driving directions from Manzhouli to Bhuj

See the map of the shortest flight path between Manzhouli Xijiao Airport (NZH) and Bhuj Airport (BHJ).

Airport information

Origin: Manzhouli Xijiao Airport
City: Manzhouli
Country: China
IATA Code: NZH
ICAO Code: ZBMZ
Coordinates: 49°34′0″N, 117°19′48″E
Destination: Bhuj Airport
City: Bhuj
Country: India
IATA Code: BHJ
ICAO Code: VABJ
Coordinates: 23°17′16″N, 69°40′12″E