
How far is Shanghai from Hubli?

The distance between Hubli (Hubli Airport) and Shanghai (Shanghai Hongqiao International Airport) is 3109 miles / 5004 kilometers / 2702 nautical miles.

The driving distance from Hubli (HBX) to Shanghai (SHA) is 4252 miles / 6843 kilometers, and travel time by car is about 79 hours 52 minutes.

Hubli Airport – Shanghai Hongqiao International Airport

Distance: 3109 miles / 5004 kilometers / 2702 nautical miles
Flight time: 6 h 23 min
Time difference: 2 h 30 min
CO2 emission: 347 kg


Distance from Hubli to Shanghai

There are several ways to calculate the distance from Hubli to Shanghai. Here are two standard methods:

Vincenty's formula (applied above)
  • 3109.471 miles
  • 5004.209 kilometers
  • 2702.057 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
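The page does not publish its implementation, but the figure above is consistent with the standard inverse Vincenty iteration on the WGS-84 ellipsoid (an assumption on our part). A minimal sketch, using the airport coordinates listed at the bottom of this page:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns metres.

    Assumes the points are neither coincident-longitude equatorial
    nor near-antipodal (where the iteration can fail to converge).
    """
    a = 6378137.0                  # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m
            * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# HBX (15°21′42″N, 75°5′5″E) to SHA (31°11′52″N, 121°20′9″E),
# converted from the DMS coordinates in the airport information section.
hbx = (15 + 21/60 + 42/3600, 75 + 5/60 + 5/3600)
sha = (31 + 11/60 + 52/3600, 121 + 20/60 + 9/3600)
km = vincenty_distance(*hbx, *sha) / 1000
```

With these inputs the result lands within a few kilometres of the 5004.209 km quoted above; small differences come from the precision of the published DMS coordinates.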

Haversine formula
  • 3106.513 miles
  • 4999.448 kilometers
  • 2699.486 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
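The haversine figure above can be reproduced in a few lines. This sketch assumes the conventional mean Earth radius of 6371 km:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# HBX (15°21′42″N, 75°5′5″E) to SHA (31°11′52″N, 121°20′9″E)
km = haversine_distance(15 + 21/60 + 42/3600, 75 + 5/60 + 5/3600,
                        31 + 11/60 + 52/3600, 121 + 20/60 + 9/3600)
# km ≈ 4999.4, matching the 4999.448 km quoted above
```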

How long does it take to fly from Hubli to Shanghai?

The estimated flight time from Hubli Airport to Shanghai Hongqiao International Airport is 6 hours and 23 minutes.
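The site does not publish its estimator. A common rule of thumb, a fixed taxi/climb overhead plus the distance flown at cruise speed, reproduces the quoted figure if one assumes a 30-minute overhead and roughly 850 km/h cruise (both values are assumptions, not the site's published model):

```python
def estimate_flight_time(distance_km, cruise_kmh=850, overhead_min=30):
    """Rule-of-thumb flight time: fixed overhead plus time at cruise speed.

    cruise_kmh and overhead_min are illustrative assumptions.
    Returns (hours, minutes).
    """
    total_min = overhead_min + distance_km / cruise_kmh * 60
    return divmod(round(total_min), 60)

# Using the Vincenty distance quoted above.
hours, minutes = estimate_flight_time(5004.209)  # (6, 23)
```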

Flight carbon footprint between Hubli Airport (HBX) and Shanghai Hongqiao International Airport (SHA)

On average, flying from Hubli to Shanghai generates about 347 kg (roughly 766 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
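The unit conversion is easy to check (1 lb is defined as exactly 0.45359237 kg). The rounded 347 kg converts to just under 766 lb; the page's pound figure presumably comes from an unrounded per-passenger value:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

lb = kg_to_lb(347)  # about 765.0 lb for the rounded 347 kg
```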

Map of flight path and driving directions from Hubli to Shanghai

See the map of the shortest flight path between Hubli Airport (HBX) and Shanghai Hongqiao International Airport (SHA).

Airport information

Origin: Hubli Airport
City: Hubli
Country: India
IATA Code: HBX
ICAO Code: VAHB
Coordinates: 15°21′42″N, 75°5′5″E
Destination: Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E