
How far is Yonago from Wuxi?

The distance between Wuxi (Sunan Shuofang International Airport) and Yonago (Miho-Yonago Airport) is 788 miles / 1269 kilometers / 685 nautical miles.

The driving distance from Wuxi (WUX) to Yonago (YGJ) is 2020 miles / 3251 kilometers, and travel time by car is about 41 hours 14 minutes.

Sunan Shuofang International Airport – Miho-Yonago Airport

788 miles / 1269 kilometers / 685 nautical miles

Distance from Wuxi to Yonago

There are several ways to calculate the distance from Wuxi to Yonago. Here are two standard methods:

Vincenty's formula (applied above)
  • 788.416 miles
  • 1268.833 kilometers
  • 685.115 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
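To reproduce the ellipsoidal figure yourself, the sketch below uses geopy's geodesic distance (Karney's method on the WGS-84 ellipsoid), which agrees with Vincenty's formula to well under a metre on a route like this; it is an illustration, not the calculator's own code. The decimal coordinates are converted from the airport data listed at the bottom of the page.

```python
# Minimal sketch: ellipsoidal distance via geopy's geodesic solver
# (WGS-84), which closely matches Vincenty's formula for this route.
from geopy.distance import geodesic

wux = (31.494167, 120.428889)   # Sunan Shuofang International Airport (WUX)
ygj = (35.491944, 133.235833)   # Miho-Yonago Airport (YGJ)

d = geodesic(wux, ygj)
print(f"{d.miles:.3f} miles")        # ~788.4 miles
print(f"{d.km:.3f} kilometers")      # ~1268.8 km
print(f"{d.nautical:.3f} nautical miles")
```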

Haversine formula
  • 787.183 miles
  • 1266.848 kilometers
  • 684.043 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
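The great-circle figure can be reproduced with a few lines of Python. This is a minimal sketch assuming a mean earth radius of 6,371 km, so the last decimal places may differ slightly from the numbers above.

```python
# Haversine (great-circle) distance on a spherical earth of mean
# radius 6371 km; small differences from the figures above come
# down to the radius chosen.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(31.494167, 120.428889, 35.491944, 133.235833)  # WUX -> YGJ
print(f"{km:.3f} km, {km * 0.621371:.3f} miles, {km / 1.852:.3f} nautical miles")
```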

How long does it take to fly from Wuxi to Yonago?

The estimated flight time from Sunan Shuofang International Airport to Miho-Yonago Airport is 1 hour and 59 minutes.
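Calculators like this one typically estimate flight time from the great-circle distance, using an average cruise speed plus a fixed allowance for taxi, climb, and descent. The sketch below uses assumed values (roughly 500 mph cruise and a 30-minute overhead, not the site's published parameters) and lands within a few minutes of the figure above.

```python
# Hypothetical back-of-the-envelope flight-time estimate; the cruise
# speed and fixed overhead are assumptions, not the calculator's
# published parameters.
CRUISE_MPH = 500          # assumed average cruise speed
OVERHEAD_MIN = 30         # assumed allowance for taxi, climb, and descent

def estimate_minutes(distance_miles):
    return OVERHEAD_MIN + distance_miles / CRUISE_MPH * 60

minutes = estimate_minutes(788.416)
print(f"~{int(minutes // 60)} h {int(minutes % 60)} min")  # roughly 2 hours
```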

Flight carbon footprint between Sunan Shuofang International Airport (WUX) and Miho-Yonago Airport (YGJ)

On average, flying from Wuxi to Yonago generates about 134 kg of CO2 per passenger (roughly 295 pounds). These figures are estimates and include only the CO2 produced by burning jet fuel.
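For reference, the unit conversion and the implied per-mile figure follow directly from the numbers on this page:

```python
# Worked conversion of the figures above: kilograms to pounds, plus the
# per-mile emission factor implied by the page's own numbers.
co2_kg = 134
distance_miles = 788.416

co2_lbs = co2_kg * 2.20462             # ~295 lbs
kg_per_mile = co2_kg / distance_miles  # ~0.17 kg CO2 per passenger-mile
print(f"{co2_lbs:.0f} lbs, {kg_per_mile:.2f} kg CO2 per passenger-mile")
```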

Map of flight path and driving directions from Wuxi to Yonago

See the map of the shortest flight path between Sunan Shuofang International Airport (WUX) and Miho-Yonago Airport (YGJ).

Airport information

Origin: Sunan Shuofang International Airport
City: Wuxi
Country: China
IATA Code: WUX
ICAO Code: ZSWX
Coordinates: 31°29′39″N, 120°25′44″E
Destination: Miho-Yonago Airport
City: Yonago
Country: Japan
IATA Code: YGJ
ICAO Code: RJOH
Coordinates: 35°29′31″N, 133°14′9″E