How far is Wuxi from Kashgar?

The distance between Kashgar (Kashgar Airport) and Wuxi (Sunan Shuofang International Airport) is 2537 miles / 4083 kilometers / 2205 nautical miles.

The driving distance from Kashgar (KHG) to Wuxi (WUX) is 3077 miles / 4952 kilometers, and travel time by car is about 56 hours 1 minute.


Distance from Kashgar to Wuxi

There are several ways to calculate the distance from Kashgar to Wuxi. Here are two standard methods:

Vincenty's formula (applied above)
  • 2537.213 miles
  • 4083.248 kilometers
  • 2204.778 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
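
For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, iteration limit, and convergence tolerance are choices made for this illustration, not details published by the calculator.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        # WGS-84 ellipsoid constants
        a = 6378137.0              # semi-major axis in metres
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):  # iterate until the longitude difference converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0.0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1.0 - sin_alpha ** 2
            # cos(2 * sigma_m); zero for points on the equatorial line
            cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)
            C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
            lam_prev = lam
            lam = L + (1.0 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                             * (-3 + 4 * cos_2sm ** 2)))
        metres = b * A * (sigma - d_sigma)
        return metres / 1609.344  # metres to statute miles

Called as vincenty_miles(39.5428, 76.0197, 31.4942, 120.4289), using the airport coordinates listed at the bottom of this page, it should land within about a mile of the 2537.213-mile figure above (the small residue comes from rounding the coordinates).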

Haversine formula
  • 2532.062 miles
  • 4074.959 kilometers
  • 2200.302 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
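
For comparison, here is a compact Python version of the haversine formula; the 6371 km mean Earth radius is an assumed convention, and other radii shift the result slightly.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        R = 6371.0  # assumed mean Earth radius in kilometres
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        km = 2 * R * math.asin(math.sqrt(a))  # great-circle distance
        return km / 1.609344  # kilometres to statute miles

With the same airport coordinates this comes out near the 2532-mile great-circle figure; the roughly five-mile gap between the two methods reflects the ellipsoidal versus spherical Earth models.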

How long does it take to fly from Kashgar to Wuxi?

The estimated flight time from Kashgar Airport to Sunan Shuofang International Airport is 5 hours and 18 minutes.
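
The page does not state its exact assumptions, but estimates like this typically divide the great-circle distance by a typical airliner cruise speed and add a fixed allowance for taxi, takeoff, and landing. A sketch of that rule of thumb, with both parameters assumed for illustration:

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        # cruise_mph and overhead_min are illustrative assumptions,
        # not the calculator's published parameters
        total_min = distance_miles / cruise_mph * 60 + overhead_min
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} hours {minutes} minutes"

With these particular values the result differs somewhat from the 5 hours 18 minutes quoted above, which is why they are labelled as assumptions rather than the site's actual parameters.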

Flight carbon footprint between Kashgar Airport (KHG) and Sunan Shuofang International Airport (WUX)

On average, flying from Kashgar to Wuxi generates about 280 kg (roughly 616 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
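
A common first-order approximation multiplies the flight distance by an average per-passenger emission factor. The factor below is reverse-engineered from the figures on this page (280 kg over 4083 km is roughly 0.069 kg CO2 per passenger-kilometre) and is an assumption, not a published constant:

    def co2_per_passenger_kg(distance_km, kg_per_pax_km=0.069):
        # kg_per_pax_km is an assumed average emission factor; real
        # estimates vary with aircraft type, load factor and routing
        return distance_km * kg_per_pax_km

    print(round(co2_per_passenger_kg(4083)))  # ~282 kg, close to the 280 kg above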

Map of flight path and driving directions from Kashgar to Wuxi

See the map of the shortest flight path between Kashgar Airport (KHG) and Sunan Shuofang International Airport (WUX).

Airport information

Origin: Kashgar Airport
City: Kashgar
Country: China
IATA Code: KHG
ICAO Code: ZWSH
Coordinates: 39°32′34″N, 76°1′11″E

Destination: Sunan Shuofang International Airport
City: Wuxi
Country: China
IATA Code: WUX
ICAO Code: ZSWX
Coordinates: 31°29′39″N, 120°25′44″E