How far is Semey from Wuhan?

The distance between Wuhan (Wuhan Tianhe International Airport) and Semey (Semey Airport) is 2209 miles / 3555 kilometers / 1919 nautical miles.

The driving distance from Wuhan (WUH) to Semey (PLX) is 2732 miles / 4397 kilometers, and travel time by car is about 50 hours 6 minutes.

Wuhan Tianhe International Airport – Semey Airport

2209 miles / 3555 kilometers / 1919 nautical miles

Distance from Wuhan to Semey

There are several ways to calculate the distance from Wuhan to Semey. Here are two standard methods:

Vincenty's formula (applied above)
  • 2208.699 miles
  • 3554.556 kilometers
  • 1919.307 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
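For a quick cross-check of the ellipsoidal figure, the geopy library can be used; this is an illustrative assumption, not the calculator's own implementation. Modern geopy computes geodesic distance on the WGS-84 ellipsoid via Karney's algorithm, which refines Vincenty's approach and gives essentially the same result:

```python
# A minimal sketch, assuming the geopy library; the page does not say which
# implementation it actually runs. geodesic() uses the WGS-84 ellipsoid.
from geopy.distance import geodesic

wuh = (30.7836, 114.2078)  # Wuhan Tianhe International Airport, decimal degrees
plx = (50.3511, 80.2342)   # Semey Airport

d = geodesic(wuh, plx)
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} NM")
# Should land very close to the figures above: ~2209 mi / ~3555 km / ~1919 NM
```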

Haversine formula
  • 2206.330 miles
  • 3550.744 kilometers
  • 1917.248 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
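Because the spherical model is simple, the haversine formula fits in a few self-contained lines. The mean Earth radius of 6371 km below is an assumption; small differences in that constant explain small differences from the figures above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(30.7836, 114.2078, 50.3511, 80.2342)
print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km * 0.539957:.1f} NM")
# Within a few kilometers of the ~3551 km figure above, depending on the
# Earth radius chosen.
```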

How long does it take to fly from Wuhan to Semey?

The estimated flight time from Wuhan Tianhe International Airport to Semey Airport is 4 hours and 40 minutes.
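The page does not state how this estimate is derived. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to the distance divided by an average cruise speed; the constants below (30 minutes of overhead, 500 mph average) are assumptions for illustration only:

```python
# Rough block-time estimate: fixed overhead + distance / average speed.
# The 30-minute overhead and 500 mph average are illustrative assumptions,
# not the calculator's published constants.
distance_mi = 2209
overhead_h = 0.5
cruise_mph = 500

hours = overhead_h + distance_mi / cruise_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"~{h} h {m:02d} min")  # ~4 h 55 min; the site's own estimate is 4 h 40 min
```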

Flight carbon footprint between Wuhan Tianhe International Airport (WUH) and Semey Airport (PLX)

On average, flying from Wuhan to Semey generates about 241 kg of CO2 per passenger, which is equivalent to 532 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
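The unit conversion and the implied per-kilometer intensity are straightforward arithmetic, sketched below; 2.20462 lb/kg is the standard conversion factor:

```python
co2_kg = 241        # per-passenger estimate quoted above
distance_km = 3555  # great-circle distance quoted above

print(f"{co2_kg * 2.20462:.0f} lb")  # ~531 lb (the page rounds to 532,
                                     # likely from an unrounded kg value)
print(f"{co2_kg / distance_km * 1000:.0f} g CO2 per passenger-km")  # ~68 g/km
```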

Map of flight path and driving directions from Wuhan to Semey

See the map of the shortest flight path between Wuhan Tianhe International Airport (WUH) and Semey Airport (PLX).

Airport information

Origin Wuhan Tianhe International Airport
City: Wuhan
Country: China
IATA Code: WUH
ICAO Code: ZHHH
Coordinates: 30°47′1″N, 114°12′28″E
Destination Semey Airport
City: Semey
Country: Kazakhstan
IATA Code: PLX
ICAO Code: UASS
Coordinates: 50°21′4″N, 80°14′3″E
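The coordinates above are given in degrees, minutes, and seconds, while distance formulas expect decimal degrees. A small conversion helper, as a sketch:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(30, 47, 1, "N"))    # 30.7836  (WUH latitude)
print(dms_to_decimal(114, 12, 28, "E"))  # 114.2078 (WUH longitude)
print(dms_to_decimal(50, 21, 4, "N"))    # 50.3511  (PLX latitude)
print(dms_to_decimal(80, 14, 3, "E"))    # 80.2342  (PLX longitude)
```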