
How far is Subang from Wuxi?

The distance between Wuxi (Sunan Shuofang International Airport) and Subang (Sultan Abdul Aziz Shah Airport) is 2306 miles / 3712 kilometers / 2004 nautical miles.

The driving distance from Wuxi (WUX) to Subang (SZB) is 3168 miles / 5098 kilometers, and travel time by car is about 60 hours 8 minutes.

Sunan Shuofang International Airport – Sultan Abdul Aziz Shah Airport

2306 Miles
3712 Kilometers
2004 Nautical miles


Distance from Wuxi to Subang

There are several ways to calculate the distance from Wuxi to Subang. Here are two standard methods:

Vincenty's formula (applied above)
  • 2306.233 miles
  • 3711.522 kilometers
  • 2004.062 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
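The iterative inverse method can be sketched as follows. This is a minimal Python implementation of the standard Vincenty inverse formula on the WGS-84 ellipsoid (the function name and coordinate values, taken from the airport table below, are chosen for illustration); it is a sketch, not necessarily the exact code this site uses.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(max_iter):
        sinlam, coslam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sinlam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * coslam) ** 2)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:          # converged
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# WUX (31°29′39″N, 120°25′44″E) to SZB (3°7′50″N, 101°32′56″E)
d = vincenty_km(31.494167, 120.428889, 3.130556, 101.548889)
print(f"{d:.3f} km")  # close to the 3711.522 km quoted above
```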

Haversine formula
  • 2312.798 miles
  • 3722.088 kilometers
  • 2009.767 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
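The spherical calculation is much simpler. Below is a minimal Python sketch of the haversine formula using a mean Earth radius of 6371 km (the function name and the decimal coordinates, derived from the airport table below, are illustrative assumptions):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# WUX (31°29′39″N, 120°25′44″E) to SZB (3°7′50″N, 101°32′56″E)
d = haversine_km(31.494167, 120.428889, 3.130556, 101.548889)
print(f"{d:.1f} km")  # close to the 3722.088 km quoted above
```

The roughly 10 km gap between this result and Vincenty's reflects the spherical approximation: the Earth is slightly flattened at the poles, which the haversine formula ignores.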

How long does it take to fly from Wuxi to Subang?

The estimated flight time from Sunan Shuofang International Airport to Sultan Abdul Aziz Shah Airport is 4 hours and 51 minutes.
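Estimates like this are typically a simple distance-over-speed model. The sketch below assumes a cruise speed of about 500 mph plus a fixed allowance for taxi, climb, and descent; both numbers are assumptions for illustration, not the site's actual parameters, so the result only roughly approximates the 4 h 51 min figure above.

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus a fixed overhead."""
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = flight_time_minutes(2306)
print(f"about {int(minutes // 60)} h {int(minutes % 60)} min")
```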

What is the time difference between Wuxi and Subang?

There is no time difference between Wuxi and Subang.

Flight carbon footprint between Sunan Shuofang International Airport (WUX) and Sultan Abdul Aziz Shah Airport (SZB)

On average, flying from Wuxi to Subang generates about 253 kg (557 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
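The unit conversion can be checked directly. This small sketch uses the exact definition of the pound (1 lb = 0.45359237 kg); 253 kg works out to roughly 557–558 lb depending on rounding.

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

print(f"{kg_to_lb(253):.1f} lb")  # roughly 557.8 lb
```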

Map of flight path and driving directions from Wuxi to Subang

See the map of the shortest flight path between Sunan Shuofang International Airport (WUX) and Sultan Abdul Aziz Shah Airport (SZB).

Airport information

Origin Sunan Shuofang International Airport
City: Wuxi
Country: China
IATA Code: WUX
ICAO Code: ZSWX
Coordinates: 31°29′39″N, 120°25′44″E
Destination Sultan Abdul Aziz Shah Airport
City: Subang
Country: Malaysia
IATA Code: SZB
ICAO Code: WMSA
Coordinates: 3°7′50″N, 101°32′56″E