
How far is Shihezi from Wichita, KS?

The distance between Wichita (Wichita Dwight D. Eisenhower National Airport) and Shihezi (Shihezi Huayuan Airport) is 6791 miles / 10928 kilometers / 5901 nautical miles.

Wichita Dwight D. Eisenhower National Airport – Shihezi Huayuan Airport

Distance: 6791 miles / 10928 kilometers / 5901 nautical miles


Distance from Wichita to Shihezi

There are several ways to calculate the distance from Wichita to Shihezi. Here are two standard methods:

Vincenty's formula (applied above)
  • 6790.600 miles
  • 10928.412 kilometers
  • 5900.870 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
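As a concrete illustration, here is a minimal inverse Vincenty solver in Python. It assumes the WGS-84 ellipsoid (the standard choice; the page does not state which ellipsoid it uses) and takes its coordinates from the airport information below, rounded to three decimal places.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: geodesic distance in km on the WGS-84 ellipsoid."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    u1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    u2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
    sin_u2, cos_u2 = math.sin(u2), math.cos(u2)
    lam = L
    for _ in range(200):                  # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cos_u2 * sin_lam,
                               cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
        cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sin_u1 * sin_u2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    usq = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + usq / 16384 * (4096 + usq * (-768 + usq * (320 - 175 * usq)))
    B = usq / 1024 * (256 + usq * (-128 + usq * (74 - 47 * usq)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

print(vincenty_km(37.650, -97.433, 44.242, 85.890))  # ≈ 10928 km
```

The iteration refines the longitude difference on the auxiliary sphere until it converges, which is why Vincenty is more accurate (but slower) than the closed-form haversine below.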

Haversine formula
  • 6774.796 miles
  • 10902.977 kilometers
  • 5887.136 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
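The haversine formula is compact enough to sketch in a few lines of Python. This sketch assumes a mean Earth radius of 6371 km (the usual convention; the page does not state the radius it uses) and the airport coordinates listed below.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r * asin(sqrt(a))

# ICT and SHF coordinates from the airport information below
ict = (37.650, -97.433)
shf = (44.242, 85.890)
print(haversine_km(*ict, *shf))  # ≈ 10903 km
```

The spherical assumption is what accounts for the roughly 25 km gap between the haversine and Vincenty figures on this route.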

How long does it take to fly from Wichita to Shihezi?

The estimated flight time from Wichita Dwight D. Eisenhower National Airport to Shihezi Huayuan Airport is 13 hours and 21 minutes.
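The page does not state the cruise speed behind this estimate, but you can back out the implied average block speed from the two figures above:

```python
distance_mi = 6791       # distance figure from above
hours = 13 + 21 / 60     # 13 hours 21 minutes estimated flight time
speed = distance_mi / hours
print(round(speed))      # ≈ 509 mph implied average speed
```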

Flight carbon footprint between Wichita Dwight D. Eisenhower National Airport (ICT) and Shihezi Huayuan Airport (SHF)

On average, flying from Wichita to Shihezi generates about 825 kg of CO2 per passenger, which is roughly 1,820 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
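The pound figure is consistent with a straight kilogram-to-pound conversion rounded to the nearest ten pounds (an assumption about how the page rounds; 825 kg converts to about 1,818.8 lb exactly):

```python
KG_TO_LB = 2.20462262185        # pounds per kilogram
co2_kg = 825
co2_lb = co2_kg * KG_TO_LB      # ≈ 1818.8 lb
print(round(co2_lb, -1))        # 1820.0, rounded to the nearest ten pounds
```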

Map of flight path from Wichita to Shihezi

See the map of the shortest flight path between Wichita Dwight D. Eisenhower National Airport (ICT) and Shihezi Huayuan Airport (SHF).
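The shortest flight path follows a great circle, which looks strongly curved on a flat map. One way to generate points along it for plotting (a sketch, assuming a spherical Earth) is spherical linear interpolation between the endpoint unit vectors:

```python
import math

def to_vec(lat, lon):
    """Lat/lon in degrees -> unit vector on the sphere."""
    phi, lmb = math.radians(lat), math.radians(lon)
    return (math.cos(phi) * math.cos(lmb),
            math.cos(phi) * math.sin(lmb),
            math.sin(phi))

def to_latlon(v):
    """Unit vector -> (lat, lon) in degrees."""
    x, y, z = v
    return math.degrees(math.asin(z)), math.degrees(math.atan2(y, x))

def great_circle_points(p1, p2, n=50):
    """n evenly spaced points along the great circle from p1 to p2 (slerp)."""
    a, b = to_vec(*p1), to_vec(*p2)
    omega = math.acos(sum(i * j for i, j in zip(a, b)))  # angular separation
    pts = []
    for k in range(n):
        t = k / (n - 1)
        s1 = math.sin((1 - t) * omega) / math.sin(omega)
        s2 = math.sin(t * omega) / math.sin(omega)
        pts.append(to_latlon(tuple(s1 * i + s2 * j for i, j in zip(a, b))))
    return pts

path = great_circle_points((37.650, -97.433), (44.242, 85.890))
```

For this route the interpolated path arcs far to the north before descending toward Xinjiang, which is why the mapped flight path bends so sharply away from a straight line on the projection.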

Airport information

Origin Wichita Dwight D. Eisenhower National Airport
City: Wichita, KS
Country: United States
IATA Code: ICT
ICAO Code: KICT
Coordinates: 37°39′0″N, 97°25′59″W
Destination Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E
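The coordinates above are given in degrees, minutes, and seconds; the distance formulas earlier expect decimal degrees. A small conversion helper, using the convention that south and west hemispheres are negative:

```python
def dms_to_decimal(deg, minutes, seconds, hemi):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemi in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# ICT: 37°39′0″N, 97°25′59″W
print(dms_to_decimal(37, 39, 0, "N"))   # 37.65
print(dms_to_decimal(97, 25, 59, "W"))  # ≈ -97.4331
```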