
How far is Huaihua from Jizan?

The distance between Jizan (Jizan Regional Airport) and Huaihua (Huaihua Zhijiang Airport) is 4314 miles / 6942 kilometers / 3748 nautical miles.

The driving distance from Jizan (GIZ) to Huaihua (HJJ) is 6359 miles / 10234 kilometers, and travel time by car is about 122 hours 16 minutes.

Jizan Regional Airport – Huaihua Zhijiang Airport

Distance: 4314 miles / 6942 kilometers / 3748 nautical miles


Distance from Jizan to Huaihua

There are several ways to calculate the distance from Jizan to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 4313.519 miles
  • 6941.937 kilometers
  • 3748.346 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
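A minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (semi-major axis 6378137 m, flattening 1/298.257223563), using the airport coordinates from this page converted to decimal degrees. This is a standard textbook implementation, not the calculator's own code:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in km via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0                      # semi-major axis, meters
    f = 1 / 298.257223563              # flattening
    b = (1 - f) * a                    # semi-minor axis
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):               # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
            C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# GIZ: 16°54′3″N, 42°35′8″E  →  HJJ: 27°26′27″N, 109°42′0″E
print(round(vincenty_km(16.9008, 42.5856, 27.4408, 109.7000), 1))
```

Run against the two airports, this reproduces the ~6942 km figure quoted above.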

Haversine formula
  • 4307.361 miles
  • 6932.026 kilometers
  • 3742.995 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
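The haversine formula is short enough to sketch in full. Assuming a mean Earth radius of 6371 km (a common convention, not stated on this page):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Airport coordinates from this page, in decimal degrees
GIZ = (16.9008, 42.5856)    # Jizan Regional Airport
HJJ = (27.4408, 109.7000)   # Huaihua Zhijiang Airport
print(round(haversine_km(*GIZ, *HJJ), 1))  # ≈ 6932 km
```

With that radius the spherical result comes out about 10 km shorter than the ellipsoidal Vincenty figure, matching the two sets of numbers above.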

How long does it take to fly from Jizan to Huaihua?

The estimated flight time from Jizan Regional Airport to Huaihua Zhijiang Airport is 8 hours and 40 minutes.
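The page does not state how the flight time is derived; a common rule of thumb is to divide the distance by an assumed average block speed of roughly 500 mph. That assumption (the 500 mph figure is mine, not the page's) lands close to the quoted 8 hours 40 minutes:

```python
distance_miles = 4313.519      # Vincenty distance from this page
cruise_mph = 500               # assumed average speed; not stated on the page
hours = distance_miles / cruise_mph
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} h {m} min")        # roughly 8 h 38 min with these assumptions
```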

Flight carbon footprint between Jizan Regional Airport (GIZ) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Jizan to Huaihua generates about 496 kg of CO2 per passenger, which equals about 1,093 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
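The unit conversion and the implied per-mile emission factor can be checked directly. The 0.115 kg/mile factor is simply what the page's two numbers imply, not a published coefficient:

```python
KG_PER_LB = 0.45359237          # exact definition of the avoirdupois pound
co2_kg = 496                    # per-passenger estimate from this page
distance_miles = 4313.519       # Vincenty distance from this page

co2_lb = co2_kg / KG_PER_LB           # ≈ 1093 lbs
per_mile = co2_kg / distance_miles    # implied factor, kg CO2 per mile
print(round(co2_lb), round(per_mile, 3))
```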

Map of flight path and driving directions from Jizan to Huaihua

See the map of the shortest flight path between Jizan Regional Airport (GIZ) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin: Jizan Regional Airport
City: Jizan
Country: Saudi Arabia
IATA Code: GIZ
ICAO Code: OEGN
Coordinates: 16°54′3″N, 42°35′8″E

Destination: Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E
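The coordinates above are given in degrees/minutes/seconds; the distance formulas need decimal degrees. A small conversion helper (the function name is my own):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees, minutes, seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# GIZ: 16°54′3″N, 42°35′8″E   HJJ: 27°26′27″N, 109°42′0″E
print(round(dms_to_decimal(16, 54, 3, "N"), 4),   # 16.9008
      round(dms_to_decimal(42, 35, 8, "E"), 4))   # 42.5856
```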