
How far is Huaihua from Lashio?

The distance between Lashio (Lashio Airport) and Huaihua (Huaihua Zhijiang Airport) is 808 miles / 1301 kilometers / 702 nautical miles.

The driving distance from Lashio (LSH) to Huaihua (HJJ) is 1099 miles / 1769 kilometers, and travel time by car is about 20 hours 5 minutes.

Lashio Airport – Huaihua Zhijiang Airport

Distance: 808 miles / 1301 kilometers / 702 nautical miles
Flight time: 2 h 1 min
Time difference: 1 h 30 min
CO2 emission: 136 kg

Distance from Lashio to Huaihua

There are several ways to calculate the distance from Lashio to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 808.279 miles
  • 1300.799 kilometers
  • 702.375 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
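
For reference, the sketch below is a plain-Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. It is an illustrative implementation, not the calculator's own code; the coordinates are the LSH and HJJ values from the airport information below, converted to decimal degrees.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2,
                     a=6378137.0, f=1 / 298.257223563,
                     tol=1e-12, max_iter=200):
    """Distance in meters between two points on the WGS-84 ellipsoid,
    via Vincenty's inverse formula."""
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero on the equatorial line where cos2_alpha == 0
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# LSH (22°58′40″N, 97°45′7″E) and HJJ (27°26′27″N, 109°42′0″E) in decimal degrees
print(round(vincenty_inverse(22.9778, 97.7519, 27.4408, 109.7000) / 1000, 3))
# ≈ 1300.8 km, matching the figure above
```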

Haversine formula
  • 807.529 miles
  • 1299.592 kilometers
  • 701.724 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
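
A compact Python version of the same formula; the 6371 km mean Earth radius is an assumed convention, and the coordinates are again the LSH and HJJ values from the airport information below.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r (mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

print(round(haversine_km(22.9778, 97.7519, 27.4408, 109.7000), 3))
# ≈ 1299.6 km, matching the figure above
```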

How long does it take to fly from Lashio to Huaihua?

The estimated flight time from Lashio Airport to Huaihua Zhijiang Airport is 2 hours and 1 minute.
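
The page does not publish its timing model; estimates like this are typically built from a fixed cruise speed plus an allowance for taxi, climb, and descent. A hypothetical sketch, where the 500 mph and 30-minute values are illustrative assumptions rather than the site's actual parameters:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Block-time estimate: cruise time plus a fixed taxi/climb/descent
    allowance. Both parameters are illustrative assumptions."""
    return overhead_min + distance_miles / cruise_mph * 60

print(round(estimated_flight_minutes(808)))
# ≈ 127 min; the page shows 2 h 1 min (121 min)
```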

Flight carbon footprint between Lashio Airport (LSH) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Lashio to Huaihua generates about 136 kg of CO2 per passenger (roughly 299 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
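
As a sanity check, the figure corresponds to a flat factor of roughly 136 kg ÷ 808 mi ≈ 0.168 kg of CO2 per passenger-mile. The sketch below applies that back-calculated factor; real estimators also account for aircraft type, load factor, and climb/descent fuel burn.

```python
def co2_kg_per_passenger(distance_miles, kg_per_mile=0.168):
    """Per-passenger CO2 from a flat emission factor. The factor is
    back-calculated from the 136 kg / 808 mi figure above, not an
    official coefficient."""
    return distance_miles * kg_per_mile

print(round(co2_kg_per_passenger(808)))  # ≈ 136 kg
```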

Map of flight path and driving directions from Lashio to Huaihua

See the map of the shortest flight path between Lashio Airport (LSH) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin Lashio Airport
City: Lashio
Country: Burma
IATA Code: LSH
ICAO Code: VYLS
Coordinates: 22°58′40″N, 97°45′7″E
Destination Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E
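
To use these coordinates with the distance formulas above, the degrees/minutes/seconds values need converting to decimal degrees. A small helper (hypothetical, not part of the page):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

lsh = (dms_to_decimal(22, 58, 40, "N"), dms_to_decimal(97, 45, 7, "E"))
hjj = (dms_to_decimal(27, 26, 27, "N"), dms_to_decimal(109, 42, 0, "E"))
print(lsh)  # ≈ (22.9778, 97.7519)
print(hjj)  # ≈ (27.4408, 109.7000)
```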