
How far is Shihezi from Jharsuguda?

The distance between Jharsuguda (Jharsuguda Airport) and Shihezi (Shihezi Huayuan Airport) is 1542 miles / 2482 kilometers / 1340 nautical miles.

The driving distance from Jharsuguda (JRG) to Shihezi (SHF) is 2756 miles / 4435 kilometers, and travel time by car is about 52 hours 14 minutes.

Jharsuguda Airport – Shihezi Huayuan Airport

Distance: 1542 miles / 2482 kilometers / 1340 nautical miles
Flight time: 3 h 25 min
CO2 emission: 182 kg


Distance from Jharsuguda to Shihezi

There are several ways to calculate the distance from Jharsuguda to Shihezi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1542.418 miles
  • 2482.281 kilometers
  • 1340.324 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
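For reference, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, convergence tolerance, and iteration cap are illustrative choices; the calculator does not publish its implementation.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2):
        """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
        a = 6378137.0          # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563  # WGS-84 flattening
        b = (1 - f) * a        # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):  # iterate until the longitude difference converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)  # equatorial geodesic case
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)  # geodesic distance in metres

Calling this with the decimal-degree equivalents of the coordinates listed under Airport information (roughly 21.9133°N, 84.0503°E and 44.2419°N, 85.8903°E) should yield approximately 2,482 km, in line with the figure above.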

Haversine formula
  • 1546.327 miles
  • 2488.572 kilometers
  • 1343.721 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points over the earth's surface).
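As a sketch, the haversine great-circle distance in Python, assuming a mean Earth radius of 6,371 km (the radius is an assumption; the calculator does not state which value it uses):

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given radius; returns kilometres."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    # JRG and SHF coordinates converted from the DMS values listed below
    print(haversine_distance(21.9133, 84.0503, 44.2419, 85.8903))  # ≈ 2488.6 km

With the 6,371 km radius this reproduces the haversine figure above to within a fraction of a kilometre.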

How long does it take to fly from Jharsuguda to Shihezi?

The estimated flight time from Jharsuguda Airport to Shihezi Huayuan Airport is 3 hours and 25 minutes.
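Calculators like this typically estimate block time as cruise time plus a fixed allowance for taxi, climb, and descent. Below is a minimal sketch assuming a cruise speed of about 530 mph and a 30-minute allowance; these are assumed values that happen to reproduce the quoted figure, not the site's published parameters.

    def estimate_flight_time(distance_miles, cruise_mph=530, overhead_min=30):
        """Block-time estimate: cruise time plus a fixed taxi/climb/descent allowance."""
        minutes = distance_miles / cruise_mph * 60 + overhead_min
        hours, mins = divmod(round(minutes), 60)
        return f"{hours} h {mins} min"

    print(estimate_flight_time(1542))  # -> "3 h 25 min" under these assumptions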

Flight carbon footprint between Jharsuguda Airport (JRG) and Shihezi Huayuan Airport (SHF)

On average, flying from Jharsuguda to Shihezi generates about 182 kg of CO2 per passenger, and 182 kilograms equals 401 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
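A simple per-passenger estimate multiplies distance by an average emission factor for jet-fuel burn. The factor below is an assumed value chosen to match the quoted figure, not the calculator's published methodology.

    def estimate_co2_kg(distance_miles, kg_per_mile=0.118):
        """Per-passenger CO2 from jet-fuel burn: distance times an assumed factor."""
        return distance_miles * kg_per_mile

    co2 = estimate_co2_kg(1542)
    print(round(co2))            # -> 182 kg with this assumed factor
    print(round(co2 * 2.20462))  # -> 401 lbs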

Map of flight path and driving directions from Jharsuguda to Shihezi

See the map of the shortest flight path between Jharsuguda Airport (JRG) and Shihezi Huayuan Airport (SHF).

Airport information

Origin: Jharsuguda Airport
City: Jharsuguda
Country: India
IATA Code: JRG
ICAO Code: VEJH
Coordinates: 21°54′48″N, 84°3′1″E
Destination: Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E
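The coordinates above are given in degrees, minutes, and seconds; converting them to the decimal degrees used by the distance functions earlier is a one-line helper (the function name is illustrative):

    def dms_to_decimal(degrees, minutes, seconds):
        """Convert degrees/minutes/seconds to decimal degrees."""
        return degrees + minutes / 60 + seconds / 3600

    print(dms_to_decimal(21, 54, 48))  # JRG latitude  -> ~21.9133
    print(dms_to_decimal(85, 53, 25))  # SHF longitude -> ~85.8903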