
How far is Sibu from Shanghai?

The distance between Shanghai (Shanghai Hongqiao International Airport) and Sibu (Sibu Airport) is 2082 miles / 3350 kilometers / 1809 nautical miles.

Shanghai Hongqiao International Airport – Sibu Airport

2082 miles / 3350 kilometers / 1809 nautical miles


Distance from Shanghai to Sibu

There are several ways to calculate the distance from Shanghai to Sibu. Here are two standard methods:

Vincenty's formula (applied above)
  • 2081.856 miles
  • 3350.423 kilometers
  • 1809.084 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
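As a rough illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, iteration cap, and convergence tolerance are our own choices for the sketch; production code should prefer a vetted library such as geographiclib.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        """Vincenty inverse distance on the WGS-84 ellipsoid, in statute miles."""
        a = 6378137.0            # WGS-84 semi-major axis, meters
        f = 1 / 298.257223563    # WGS-84 flattening
        b = (1 - f) * a          # semi-minor axis

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):  # iterate until the auxiliary longitude converges
            sinLam, cosLam = math.sin(lam), math.cos(lam)
            sinSigma = math.hypot(cosU2 * sinLam,
                                  cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
            if sinSigma == 0:
                return 0.0  # coincident points
            cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
            sigma = math.atan2(sinSigma, cosSigma)
            sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
            cos2Alpha = 1 - sinAlpha ** 2
            cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
            C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
            lamPrev = lam
            lam = L + (1 - C) * f * sinAlpha * (
                sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                        (-1 + 2 * cos2SigmaM ** 2)))
            if abs(lam - lamPrev) < 1e-12:
                break

        u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
            B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
            (-3 + 4 * cos2SigmaM ** 2)))
        return b * A * (sigma - dSigma) / 1609.344  # meters -> statute miles

    # SHA (31°11′52″N, 121°20′9″E) to SBW (2°15′41″N, 111°59′6″E)
    print(vincenty_miles(31.19778, 121.33583, 2.26139, 111.98500))  # ≈ 2082 miles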

Haversine formula
  • 2090.353 miles
  • 3364.097 kilometers
  • 1816.467 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
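For comparison, a minimal Python sketch of the haversine formula. The mean Earth radius of 3,958.8 miles is an assumed constant, and the arguments are the airport coordinates listed below converted to decimal degrees:

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance between two lat/lon points, assuming a spherical Earth."""
        R = 3958.8  # assumed mean Earth radius in statute miles
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(h))

    # SHA (31°11′52″N, 121°20′9″E) to SBW (2°15′41″N, 111°59′6″E)
    print(haversine_miles(31.19778, 121.33583, 2.26139, 111.98500))  # ≈ 2090 miles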

How long does it take to fly from Shanghai to Sibu?

The estimated flight time from Shanghai Hongqiao International Airport to Sibu Airport is 4 hours and 26 minutes.
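The page does not state how this figure is derived, but estimates like it typically combine the great-circle distance, an assumed average cruise speed, and a fixed allowance for taxi, climb, and descent. A hypothetical sketch follows; the 500 mph speed and 30-minute overhead are illustrative assumptions only (the page's own 4 h 26 min implies somewhat different constants):

    def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent
        allowance. Both constants are illustrative assumptions, not this site's."""
        total_min = distance_miles / cruise_mph * 60 + overhead_min
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} hours and {minutes} minutes"

    print(estimated_flight_time(2082))  # ≈ 4 hours and 40 minutes with these assumptions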

What is the time difference between Shanghai and Sibu?

There is no time difference between Shanghai and Sibu; both cities observe UTC+8.

Flight carbon footprint between Shanghai Hongqiao International Airport (SHA) and Sibu Airport (SBW)

On average, flying from Shanghai to Sibu generates about 227 kg (roughly 500 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
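As a sanity check on the unit conversion and the implied emissions intensity, a short sketch; the 227 kg per-passenger figure comes from the page, and the per-mile value is simply that figure divided by the distance:

    KG_PER_LB = 0.45359237  # exact definition of the pound

    co2_kg = 227            # per-passenger estimate from the page
    distance_miles = 2082

    print(f"{co2_kg / KG_PER_LB:.0f} lb")            # ≈ 500 lb
    print(f"{co2_kg / distance_miles:.3f} kg/mile")  # ≈ 0.109 kg CO2 per passenger-mile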

Map of flight path from Shanghai to Sibu

See the map of the shortest flight path between Shanghai Hongqiao International Airport (SHA) and Sibu Airport (SBW).

Airport information

Origin: Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E
Destination: Sibu Airport
City: Sibu
Country: Malaysia
IATA Code: SBW
ICAO Code: WBGS
Coordinates: 2°15′41″N, 111°59′6″E
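The distance formulas above take decimal degrees, so the degrees/minutes/seconds coordinates listed here need converting first. A small helper (the function name is ours):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # SHA: 31°11′52″N, 121°20′9″E
    print(dms_to_decimal(31, 11, 52, "N"), dms_to_decimal(121, 20, 9, "E"))
    # SBW: 2°15′41″N, 111°59′6″E
    print(dms_to_decimal(2, 15, 41, "N"), dms_to_decimal(111, 59, 6, "E"))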