
How far is Huaihua from Ozar?

The distance between Ozar (Nashik Airport) and Huaihua (Huaihua Zhijiang Airport) is 2313 miles / 3723 kilometers / 2010 nautical miles.

The driving distance from Ozar (ISK) to Huaihua (HJJ) is 3153 miles / 5075 kilometers, and travel time by car is about 62 hours 24 minutes.

Nashik Airport – Huaihua Zhijiang Airport

Distance: 2313 miles / 3723 kilometers / 2010 nautical miles
Flight time: 4 h 52 min
Time difference: 2 h 30 min
CO2 emission: 253 kg


Distance from Ozar to Huaihua

There are several ways to calculate the distance from Ozar to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 2313.382 miles
  • 3723.027 kilometers
  • 2010.274 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
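For reference, a minimal Python sketch of the Vincenty inverse method on the WGS-84 ellipsoid. The airport coordinates are the ones listed in the airport information section below, converted to decimal degrees; the calculator's exact parameters are not published, so the last decimals may differ.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Iterative Vincenty inverse solution on the WGS-84 ellipsoid.
    Returns the geodesic distance in meters. May fail to converge
    for nearly antipodal points."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1, U2 = math.atan((1 - f) * math.tan(phi1)), math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                      # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)   # points on the equator
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                               * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# ISK and HJJ coordinates from the airport information section below.
meters = vincenty_distance(20.1189, 73.9128, 27.4408, 109.7000)
print(f"{meters / 1000:.0f} km")   # should land close to the 3723 km quoted above
```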

Haversine formula
  • 2310.155 miles
  • 3717.835 kilometers
  • 2007.470 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
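A corresponding haversine sketch in Python. The mean Earth radius of 6371 km is a common choice; the site's exact radius is not stated, so small differences in the last decimals are expected.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

print(f"{haversine_distance(20.1189, 73.9128, 27.4408, 109.7000):.0f} km")
# roughly 3718 km, close to the haversine figure listed above
```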

How long does it take to fly from Ozar to Huaihua?

The estimated flight time from Nashik Airport to Huaihua Zhijiang Airport is 4 hours and 52 minutes.
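The calculator does not publish its assumptions; a common approach is to divide the distance by an assumed average block speed, sometimes adding a fixed allowance for taxi, climb, and descent. The sketch below uses a hypothetical average speed of about 475 mph, which happens to land near 4 h 52 min for this route; it is an illustration, not the site's documented method.

```python
def estimated_flight_time(distance_miles, avg_speed_mph=475, allowance_min=0):
    """Crude block-time estimate: distance / assumed average speed (+ allowance)."""
    total_min = distance_miles / avg_speed_mph * 60 + allowance_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(2313.382))   # about 4 h 52 min under these assumptions
```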

Flight carbon footprint between Nashik Airport (ISK) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Ozar to Huaihua generates about 253 kg of CO2 per passenger, equivalent to roughly 559 pounds (lbs). The figures are estimates that include only the CO2 generated by burning jet fuel.
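The figures quoted on this page imply a constant emission factor of roughly 0.109 kg of CO2 per passenger-mile (253 kg ÷ 2313 miles). The calculator's actual methodology is not stated; the sketch below simply applies that implied factor and converts kilograms to pounds.

```python
KG_PER_LB = 0.45359237   # exact definition of the international pound

def co2_estimate_kg(distance_miles, kg_per_passenger_mile=253 / 2313.382):
    """Apply a constant per-passenger-mile emission factor (assumed here,
    derived from the figures quoted on this page, not an official method)."""
    return distance_miles * kg_per_passenger_mile

kg = co2_estimate_kg(2313.382)
print(f"{kg:.0f} kg CO2 ~ {kg / KG_PER_LB:.0f} lb")
# about 253 kg ~ 558 lb; the page's 559 lb likely comes from an unrounded kg value
```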

Map of flight path and driving directions from Ozar to Huaihua

See the map of the shortest flight path between Nashik Airport (ISK) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin: Nashik Airport
City: Ozar
Country: India
IATA Code: ISK
ICAO Code: VAOZ
Coordinates: 20°7′8″N, 73°54′46″E
Destination: Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E
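The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion helper (the values below simply restate the coordinates listed above):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

isk = (dms_to_decimal(20, 7, 8, "N"), dms_to_decimal(73, 54, 46, "E"))
hjj = (dms_to_decimal(27, 26, 27, "N"), dms_to_decimal(109, 42, 0, "E"))
print(isk, hjj)   # approximately (20.1189, 73.9128) and (27.4408, 109.7000)
```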