How far is Huaihua from Pakse?

The distance between Pakse (Pakse International Airport) and Huaihua (Huaihua Zhijiang Airport) is 884 miles / 1422 kilometers / 768 nautical miles.

The driving distance from Pakse (PKZ) to Huaihua (HJJ) is 1225 miles / 1972 kilometers, and travel time by car is about 23 hours 19 minutes.

Pakse International Airport – Huaihua Zhijiang Airport

884 miles / 1422 kilometers / 768 nautical miles

Distance from Pakse to Huaihua

There are several ways to calculate the distance from Pakse to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 883.562 miles
  • 1421.955 kilometers
  • 767.794 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
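For readers who want to reproduce the figures, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It skips the coincident-point and near-antipodal edge cases a production implementation must handle, and the decimal-degree coordinates are converted from the PKZ and HJJ values listed in the airport table below:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:   # converges in a few iterations
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters -> statute miles

# PKZ and HJJ coordinates from the airport table below, in decimal degrees
print(vincenty_miles(15.131944, 105.780833, 27.440833, 109.7))  # ~883.6
```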

Haversine formula
  • 886.904 miles
  • 1427.334 kilometers
  • 770.699 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of a sphere).
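The haversine counterpart is much shorter. The sketch below assumes the commonly used mean earth radius of 6,371 km; the page does not say which radius it uses, so the final decimals may differ slightly:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344       # kilometers -> statute miles

print(haversine_miles(15.131944, 105.780833, 27.440833, 109.7))  # ~886.9
```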

How long does it take to fly from Pakse to Huaihua?

The estimated flight time from Pakse International Airport to Huaihua Zhijiang Airport is 2 hours and 10 minutes.
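The page does not publish its timing model. A common rule of thumb, assumed here purely for illustration, is a cruise speed of about 500 mph plus roughly 30 minutes for taxi, climb, and descent; for 884 miles it lands within a few minutes of the quoted figure:

```python
def flight_time_minutes(miles, cruise_mph=500, overhead_min=30):
    """Rough airline-style estimate: cruise time plus a fixed taxi/climb/descent buffer."""
    return overhead_min + 60 * miles / cruise_mph

minutes = flight_time_minutes(884)               # ~136 min
print(f"{minutes // 60:.0f} h {minutes % 60:.0f} min")   # about 2 h 16 min
```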

Flight carbon footprint between Pakse International Airport (PKZ) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Pakse to Huaihua generates about 142 kg of CO2 per passenger, which is roughly 313 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
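The emissions model is likewise unpublished. Working backwards from the page's own numbers, a flat factor of roughly 0.1 kg of CO2 per passenger-kilometre reproduces the 142 kg figure, so the sketch below assumes that factor:

```python
KG_CO2_PER_PAX_KM = 0.1   # assumed factor, inferred from the page's own numbers
KG_PER_LB = 0.45359237

def co2_per_passenger_kg(distance_km, factor=KG_CO2_PER_PAX_KM):
    """Jet-fuel CO2 estimate per passenger for a flight of the given length."""
    return distance_km * factor

kg = co2_per_passenger_kg(1421.955)   # Vincenty distance from above
print(f"{kg:.0f} kg CO2 (~{kg / KG_PER_LB:.0f} lb)")   # ~142 kg (~313 lb)
```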

Map of flight path and driving directions from Pakse to Huaihua

See the map of the shortest flight path between Pakse International Airport (PKZ) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin: Pakse International Airport
City: Pakse
Country: Laos
IATA Code: PKZ
ICAO Code: VLPS
Coordinates: 15°7′55″N, 105°46′51″E
Destination: Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E
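The coordinates above are given in degrees, minutes, and seconds; the decimal-degree values used in the distance sketches earlier follow from the standard conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(15, 7, 55, "N"))    # PKZ latitude  -> 15.131944...
print(dms_to_decimal(105, 46, 51, "E"))  # PKZ longitude -> 105.780833...
print(dms_to_decimal(27, 26, 27, "N"))   # HJJ latitude  -> 27.440833...
print(dms_to_decimal(109, 42, 0, "E"))   # HJJ longitude -> 109.7
```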