
How far is Shihezi from Yanji?

The distance between Yanji (Yanji Chaoyangchuan International Airport) and Shihezi (Shihezi Huayuan Airport) is 2163 miles / 3481 kilometers / 1880 nautical miles.

The driving distance from Yanji (YNJ) to Shihezi (SHF) is 2543 miles / 4092 kilometers, and travel time by car is about 46 hours 52 minutes.

Yanji Chaoyangchuan International Airport – Shihezi Huayuan Airport

  • 2163 miles
  • 3481 kilometers
  • 1880 nautical miles


Distance from Yanji to Shihezi

There are several ways to calculate the distance from Yanji to Shihezi. Here are two standard methods:

Vincenty's formula (applied above)
  • 2162.997 miles
  • 3481.005 kilometers
  • 1879.593 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
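As a sketch of the ellipsoidal approach, the snippet below solves the inverse geodesic problem on the WGS-84 ellipsoid with pyproj's Geod class (modern pyproj uses Karney's algorithm, which agrees with Vincenty's formula to well under a metre at this range). This is an illustration, not the calculator's own implementation; the decimal coordinates are converted from the DMS values in the airport information below.

```python
# A minimal sketch of an ellipsoidal distance calculation using pyproj.
# Geod solves the inverse geodesic problem on the WGS-84 ellipsoid;
# the result matches Vincenty's figure quoted above.
from pyproj import Geod

geod = Geod(ellps="WGS84")

# YNJ and SHF coordinates in decimal degrees (converted from the DMS
# values in the airport information section below).
ynj_lat, ynj_lon = 42.8828, 129.4508
shf_lat, shf_lon = 44.2419, 85.8903

# inv() takes lon/lat order and returns forward azimuth, back azimuth,
# and distance in metres.
_, _, dist_m = geod.inv(ynj_lon, ynj_lat, shf_lon, shf_lat)
print(f"{dist_m / 1000:.1f} km")  # ≈ 3481 km
```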

Haversine formula
  • 2157.159 miles
  • 3471.611 kilometers
  • 1874.520 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
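Because the haversine formula is pure trigonometry, it fits in a few lines. The sketch below uses the conventional mean Earth radius of 6371 km and the same decimal coordinates; it reproduces the figure above to within rounding.

```python
# A minimal sketch of the haversine formula (great-circle distance on
# a sphere of mean radius 6371 km).
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

print(f"{haversine_km(42.8828, 129.4508, 44.2419, 85.8903):.1f} km")
# ≈ 3472 km, matching the haversine figure above
```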

How long does it take to fly from Yanji to Shihezi?

The estimated flight time from Yanji Chaoyangchuan International Airport to Shihezi Huayuan Airport is 4 hours and 35 minutes.
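Flight-time figures like this generally come from a simple rule of thumb: great-circle distance divided by an assumed cruising speed, plus a fixed allowance for taxi, climb, and descent. The sketch below uses a 450-knot cruise and a 25-minute allowance; both parameters are assumptions chosen to illustrate the method, not the calculator's published model.

```python
# A hedged sketch of a flight-time estimate: distance / cruise speed
# plus a fixed taxi/climb/descent allowance. The 450 kt cruise and
# 25-minute allowance are assumptions, not the site's actual model.
DISTANCE_NM = 1880        # great-circle distance from above
CRUISE_KT = 450           # assumed cruise speed (knots)
ALLOWANCE_MIN = 25        # assumed taxi/climb/descent buffer

total_min = DISTANCE_NM / CRUISE_KT * 60 + ALLOWANCE_MIN
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # ≈ 4 h 35 min
```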

Flight carbon footprint between Yanji Chaoyangchuan International Airport (YNJ) and Shihezi Huayuan Airport (SHF)

On average, flying from Yanji to Shihezi generates about 236 kg (521 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
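For reference, the unit conversion and the implied emission intensity can be checked directly. The per-kilometre figure below is derived from the numbers above, not an official emission factor; the quoted 521 lbs was presumably converted from an unrounded kilogram value.

```python
# Unit conversion and implied emission intensity, derived from the
# figures quoted above (not an official emission factor).
co2_kg = 236
distance_km = 3481

print(f"{co2_kg * 2.20462:.1f} lbs")              # ≈ 520.3 lbs
print(f"{co2_kg / distance_km * 1000:.0f} g/km")  # ≈ 68 g of CO2 per km
```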

Map of flight path and driving directions from Yanji to Shihezi

See the map of the shortest flight path between Yanji Chaoyangchuan International Airport (YNJ) and Shihezi Huayuan Airport (SHF).

Airport information

Origin: Yanji Chaoyangchuan International Airport
City: Yanji
Country: China
IATA Code: YNJ
ICAO Code: ZYYJ
Coordinates: 42°52′58″N, 129°27′3″E

Destination: Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E
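
The decimal degrees used in the code sketches above come from these DMS coordinates; the conversion is plain arithmetic:

```python
# A minimal sketch converting the DMS coordinates above to the
# decimal degrees used in the earlier examples.
def dms_to_decimal(degrees, minutes, seconds):
    return degrees + minutes / 60 + seconds / 3600

print(f"YNJ: {dms_to_decimal(42, 52, 58):.4f}N, {dms_to_decimal(129, 27, 3):.4f}E")
print(f"SHF: {dms_to_decimal(44, 14, 31):.4f}N, {dms_to_decimal(85, 53, 25):.4f}E")
# YNJ: 42.8828N, 129.4508E
# SHF: 44.2419N, 85.8903E
```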