
How far is Huaihua from Anchorage, AK?

The distance between Anchorage (Ted Stevens Anchorage International Airport) and Huaihua (Huaihua Zhijiang Airport) is 4908 miles / 7899 kilometers / 4265 nautical miles.

Ted Stevens Anchorage International Airport – Huaihua Zhijiang Airport

  • 4908 miles
  • 7899 kilometers
  • 4265 nautical miles


Distance from Anchorage to Huaihua

There are several ways to calculate the distance from Anchorage to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 4907.975 miles
  • 7898.620 kilometers
  • 4264.913 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
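For readers who want to reproduce the ellipsoidal figure, here is a minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid in plain Python. The function name is illustrative, and the coordinates are the ANC and HJJ values listed in the airport information below, converted to decimal degrees.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in metres between two points on the WGS-84 ellipsoid."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L

    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        if cos_sq_alpha == 0:
            cos_2sigma_m = 0.0                # both points on the equator
        else:
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# ANC: 61°10′27″N, 149°59′45″W   HJJ: 27°26′27″N, 109°42′0″E
anc_lat, anc_lon = 61 + 10/60 + 27/3600, -(149 + 59/60 + 45/3600)
hjj_lat, hjj_lon = 27 + 26/60 + 27/3600, 109 + 42/60
metres = vincenty_distance(anc_lat, anc_lon, hjj_lat, hjj_lon)
print(f"{metres / 1000:.1f} km  /  {metres / 1609.344:.1f} miles")
```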

Haversine formula
  • 4898.812 miles
  • 7883.873 kilometers
  • 4256.951 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
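A spherical-earth counterpart is sketched below, assuming the commonly used mean earth radius of 6,371 km; since the page does not state which radius it uses, the last digits may differ slightly from the figures above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres between two points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same ANC and HJJ coordinates as above, in decimal degrees
km = haversine_distance(61.17417, -149.99583, 27.44083, 109.70000)
print(f"{km:.1f} km  /  {km / 1.609344:.1f} miles")   # ≈ 7,884 km / 4,899 miles
```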

How long does it take to fly from Anchorage to Huaihua?

The estimated flight time from Ted Stevens Anchorage International Airport to Huaihua Zhijiang Airport is 9 hours and 47 minutes.
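The page does not state the assumptions behind this estimate. A rough sketch that lands in the same ballpark, assuming an average block speed of about 500 mph, might look like this; the speed and rounding are assumptions, not the calculator's published method.

```python
# Hypothetical flight-time estimate: distance divided by an assumed
# average block speed of ~500 mph (gate to gate).
distance_miles = 4908
avg_speed_mph = 500                    # assumed average speed

hours_float = distance_miles / avg_speed_mph
hours = int(hours_float)
minutes = round((hours_float - hours) * 60)
print(f"{hours} h {minutes} min")      # ≈ 9 h 49 min, close to the quoted 9 h 47 min
```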

Flight carbon footprint between Ted Stevens Anchorage International Airport (ANC) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Anchorage to Huaihua generates about 572 kg of CO2 per passenger; 572 kilograms is equal to 1,261 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
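The kilograms-to-pounds conversion can be checked directly; the per-passenger CO2 figure itself comes from the calculator, and its underlying fuel-burn assumptions are not given here.

```python
KG_TO_LB = 2.20462           # pounds per kilogram

co2_kg = 572                 # per-passenger estimate quoted above
co2_lb = co2_kg * KG_TO_LB
print(f"{co2_lb:,.0f} lb")   # ≈ 1,261 lb
```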

Map of flight path from Anchorage to Huaihua

See the map of the shortest flight path between Ted Stevens Anchorage International Airport (ANC) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin: Ted Stevens Anchorage International Airport
City: Anchorage, AK
Country: United States
IATA Code: ANC
ICAO Code: PANC
Coordinates: 61°10′27″N, 149°59′45″W

Destination: Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E