
How far is Shanghai from Bhuj?

The distance between Bhuj (Bhuj Airport) and Shanghai (Shanghai Hongqiao International Airport) is 3198 miles / 5146 kilometers / 2779 nautical miles.

The driving distance from Bhuj (BHJ) to Shanghai (SHA) is 4279 miles / 6886 kilometers, and travel time by car is about 81 hours 17 minutes.

Bhuj Airport – Shanghai Hongqiao International Airport

Distance: 3198 miles / 5146 kilometers / 2779 nautical miles
Flight time: 6 h 33 min
Time difference: 2 h 30 min
CO2 emission: 358 kg

Distance from Bhuj to Shanghai

There are several ways to calculate the distance from Bhuj to Shanghai. Here are two standard methods:

Vincenty's formula (applied above)
  • 3197.584 miles
  • 5146.013 kilometers
  • 2778.625 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
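
For readers who want to reproduce the numbers, here is a minimal Python sketch of Vincenty's inverse formula. The WGS-84 constants and the convergence tolerance are assumptions, since the calculator does not publish its implementation:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2,
                     a=6378137.0, f=1 / 298.257223563,
                     tol=1e-12, max_iter=200):
    """Distance in metres between two points on the WGS-84 ellipsoid."""
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0                       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        # cos(2 * sigma_m); zero on the equatorial line (cos2_alpha == 0)
        cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)
        - B / 6.0 * cos_2sm * (-3.0 + 4.0 * sin_sigma ** 2)
        * (-3.0 + 4.0 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# BHJ: 23°17′16″N, 69°40′12″E   SHA: 31°11′52″N, 121°20′9″E
d_m = vincenty_inverse(23.287778, 69.67, 31.197778, 121.335833)
print(f"{d_m / 1000:.3f} km")   # ≈ 5146 km, matching the figure above
```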

Haversine formula
  • 3192.269 miles
  • 5137.460 kilometers
  • 2774.006 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
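
A matching sketch of the haversine formula; with a mean Earth radius of 6,371 km (an assumed constant) it reproduces the haversine figures above:

```python
import math

def haversine(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance in kilometres on a sphere of radius r_km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(a))

print(f"{haversine(23.287778, 69.67, 31.197778, 121.335833):.3f} km")
# ≈ 5137.46 km, matching the haversine figure above
```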

How long does it take to fly from Bhuj to Shanghai?

The estimated flight time from Bhuj Airport to Shanghai Hongqiao International Airport is 6 hours and 33 minutes.
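
The page does not state how the flight time is derived. A common back-of-the-envelope model, distance at a constant cruise speed plus a fixed allowance for takeoff and landing, reproduces the figure; the 850 km/h cruise speed and 30-minute allowance below are assumptions:

```python
def flight_time_minutes(distance_km, cruise_kmh=850.0, overhead_min=30.0):
    """Rough block time: cruise at a constant speed plus a fixed
    allowance for taxi, takeoff, climb, and descent (assumed values)."""
    return distance_km / cruise_kmh * 60.0 + overhead_min

total = flight_time_minutes(5146)
print(f"{int(total // 60)} h {round(total % 60)} min")   # 6 h 33 min
```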

Flight carbon footprint between Bhuj Airport (BHJ) and Shanghai Hongqiao International Airport (SHA)

On average, flying from Bhuj to Shanghai generates about 358 kg (789 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
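
The methodology behind the CO2 estimate is not given. As an illustration only, a flat emission factor back-calculated from the page's own numbers (about 0.07 kg of CO2 per passenger-kilometre, an assumption) yields the same result:

```python
KG_PER_PAX_KM = 0.0696   # assumed flat emission factor (back-calculated)
KG_TO_LBS = 2.20462      # kilograms to pounds

co2_kg = round(5146 * KG_PER_PAX_KM)                    # ≈ 358 kg
print(f"{co2_kg} kg ≈ {co2_kg * KG_TO_LBS:.0f} lbs")    # 358 kg ≈ 789 lbs
```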

Map of flight path and driving directions from Bhuj to Shanghai

See the map of the shortest flight path between Bhuj Airport (BHJ) and Shanghai Hongqiao International Airport (SHA).

Airport information

Origin: Bhuj Airport
City: Bhuj
Country: India
IATA Code: BHJ
ICAO Code: VABJ
Coordinates: 23°17′16″N, 69°40′12″E
Destination: Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E