
How far is Nanchong from Bhuj?

The distance between Bhuj (Bhuj Airport) and Nanchong (Nanchong Gaoping Airport) is 2292 miles / 3688 kilometers / 1991 nautical miles.

The driving distance from Bhuj (BHJ) to Nanchong (NAO) is 3225 miles / 5190 kilometers, and travel time by car is about 62 hours 24 minutes.

Bhuj Airport – Nanchong Gaoping Airport

Distance: 2292 miles / 3688 kilometers / 1991 nautical miles
Flight time: 4 h 50 min
Time difference: 2 h 30 min
CO2 emission: 251 kg


Distance from Bhuj to Nanchong

There are several ways to calculate the distance from Bhuj to Nanchong. Here are two standard methods:

Vincenty's formula (applied above)
  • 2291.524 miles
  • 3687.851 kilometers
  • 1991.280 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
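The iterative Vincenty inverse method on the WGS-84 ellipsoid can be sketched in Python as follows; the function name is chosen for illustration, and the airport coordinates are taken from the airport information section below, converted to decimal degrees.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns km."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# BHJ (23°17′16″N, 69°40′12″E) to NAO (30°45′14″N, 106°3′43″E)
print(round(vincenty_km(23.287778, 69.67, 30.753889, 106.061944), 3))
```

The loop converges in a handful of iterations for non-antipodal points like these; the result agrees with the 3687.851 km figure quoted above.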

Haversine formula
  • 2287.997 miles
  • 3682.175 kilometers
  • 1988.215 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
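The haversine calculation is much simpler; a minimal Python sketch, assuming a mean Earth radius of 6371 km and using the same airport coordinates in decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# BHJ to NAO, decimal degrees
print(round(haversine_km(23.287778, 69.67, 30.753889, 106.061944), 3))
```

The spherical assumption is why this result (about 3682 km) differs slightly from the ellipsoidal Vincenty figure.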

How long does it take to fly from Bhuj to Nanchong?

The estimated flight time from Bhuj Airport to Nanchong Gaoping Airport is 4 hours and 50 minutes.
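A rough block-time estimate can be made by dividing the great-circle distance by an assumed average speed. The calculator's exact method is not published here; the speed used below is a hypothetical figure chosen purely for illustration.

```python
def flight_time(distance_miles, avg_speed_mph=475):
    """Rough block-time estimate. The average speed is an assumption,
    not the calculator's published method."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

print(flight_time(2292))
```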

Flight carbon footprint between Bhuj Airport (BHJ) and Nanchong Gaoping Airport (NAO)

On average, flying from Bhuj to Nanchong generates about 251 kg of CO2 per passenger, which is equivalent to 553 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
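The kilograms-to-pounds conversion behind that figure is straightforward; a quick sketch using the standard factor of roughly 2.20462 lb per kg:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

def kg_to_lb(kg):
    """Convert a mass in kilograms to pounds."""
    return kg * KG_TO_LB

print(round(kg_to_lb(251)))  # 251 kg of CO2 in pounds
```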

Map of flight path and driving directions from Bhuj to Nanchong

See the map of the shortest flight path between Bhuj Airport (BHJ) and Nanchong Gaoping Airport (NAO).

Airport information

Origin Bhuj Airport
City: Bhuj
Country: India
IATA Code: BHJ
ICAO Code: VABJ
Coordinates: 23°17′16″N, 69°40′12″E
Destination Nanchong Gaoping Airport
City: Nanchong
Country: China
IATA Code: NAO
ICAO Code: ZUNC
Coordinates: 30°45′14″N, 106°3′43″E
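The coordinates above are given in degrees, minutes, and seconds; the distance formulas need them in decimal degrees. A small helper (name chosen for illustration) for the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Bhuj Airport: 23°17′16″N, 69°40′12″E
print(dms_to_decimal(23, 17, 16, "N"), dms_to_decimal(69, 40, 12, "E"))
```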