
How far is Wuxi from Gander?

The distance between Gander (Gander International Airport) and Wuxi (Sunan Shuofang International Airport) is 6886 miles / 11082 kilometers / 5984 nautical miles.

Gander International Airport – Sunan Shuofang International Airport

Distance: 6886 miles / 11082 kilometers / 5984 nautical miles
Flight time: 13 h 32 min
Time difference: 11 h 30 min
CO2 emission: 839 kg


Distance from Gander to Wuxi

There are several ways to calculate the distance from Gander to Wuxi. Here are two standard methods:

Vincenty's formula (applied above)
  • 6885.868 miles
  • 11081.730 kilometers
  • 5983.656 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
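
As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, iteration cap, and convergence tolerance are our own choices, not the calculator's actual code:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters.
    May fail to converge for nearly antipodal points."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(200):     # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YQX and WUX in decimal degrees (converted from the DMS values listed below)
yqx = (48.9367, -54.5681)
wux = (31.4942, 120.4289)
print(round(vincenty_distance(*yqx, *wux) / 1000, 1))  # ≈ 11081.7 km
```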

Haversine formula
  • 6870.977 miles
  • 11057.765 kilometers
  • 5970.715 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
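
A haversine sketch in the same style. The mean Earth radius of 6371 km is an assumption; a different radius shifts the result slightly:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(round(haversine_distance(48.9367, -54.5681, 31.4942, 120.4289), 1))
# ≈ 11058 km, in line with the haversine figure above
```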

How long does it take to fly from Gander to Wuxi?

The estimated flight time from Gander International Airport to Sunan Shuofang International Airport is 13 hours and 32 minutes.
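
The page does not state its timing model, but a common heuristic, a fixed 30-minute taxi/climb/descent allowance plus cruise at about 850 km/h, reproduces the quoted figure. Both constants below are assumptions:

```python
def estimate_flight_time(distance_km: float,
                         cruise_kmh: float = 850.0,
                         overhead_h: float = 0.5) -> str:
    """Rough flight-time estimate: fixed ground/climb allowance plus
    great-circle distance flown at an assumed cruise speed."""
    hours = overhead_h + distance_km / cruise_kmh
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m} min"

print(estimate_flight_time(11082))  # -> "13 h 32 min"
```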

Flight carbon footprint between Gander International Airport (YQX) and Sunan Shuofang International Airport (WUX)

On average, flying from Gander to Wuxi generates about 839 kg (1,849 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
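
One simple way to model this is a linear per-passenger emissions factor. The 0.0757 kg per passenger-kilometer below is back-calculated from the 839 kg figure and is purely illustrative; real estimates vary with aircraft type, load factor, and routing:

```python
KG_PER_PAX_KM = 0.0757  # assumption: back-calculated from 839 kg / 11082 km

def estimate_co2_kg(distance_km: float, factor: float = KG_PER_PAX_KM) -> float:
    """Per-passenger CO2 estimate as a linear function of distance flown."""
    return distance_km * factor

print(round(estimate_co2_kg(11082)))            # -> 839 (kg)
print(round(estimate_co2_kg(11082) * 2.20462))  # -> 1849 (lb)
```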

Map of flight path from Gander to Wuxi

See the map of the shortest flight path between Gander International Airport (YQX) and Sunan Shuofang International Airport (WUX).

Airport information

Origin: Gander International Airport
City: Gander
Country: Canada
IATA Code: YQX
ICAO Code: CYQX
Coordinates: 48°56′12″N, 54°34′5″W

Destination: Sunan Shuofang International Airport
City: Wuxi
Country: China
IATA Code: WUX
ICAO Code: ZSWX
Coordinates: 31°29′39″N, 120°25′44″E
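
If you need the coordinates above in decimal degrees (for example, to feed the distance functions earlier), a small conversion helper like this sketch works. The regex assumes the exact D°M′S″ format shown here:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a D°M′S″ hemisphere string such as "48°56′12″N"
    to signed decimal degrees (negative for S and W)."""
    d, m, s, hemi = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(d) + int(m) / 60 + int(s) / 3600
    return -value if hemi in "SW" else value

print(round(dms_to_decimal("48°56′12″N"), 4))   # ->  48.9367 (YQX latitude)
print(round(dms_to_decimal("120°25′44″E"), 4))  # -> 120.4289 (WUX longitude)
```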