How far is Huaihua from Hambantota?
The distance between Hambantota (Hambantota Mattala Rajapaksa International Airport) and Huaihua (Huaihua Zhijiang Airport) is 2375 miles / 3822 kilometers / 2063 nautical miles.
Hambantota Mattala Rajapaksa International Airport – Huaihua Zhijiang Airport
Distance from Hambantota to Huaihua
There are several ways to calculate the distance from Hambantota to Huaihua. Here are two standard methods:
Vincenty's formula (applied above)
- 2374.584 miles
- 3821.522 kilometers
- 2063.457 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
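As a rough cross-check of the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal-degree coordinates (converted from the airport tables below), the iteration tolerance, and the iteration cap are assumptions for illustration; the page's exact numbers may differ slightly due to rounding.

```python
import math

# WGS-84 ellipsoid parameters
A = 6378137.0              # semi-major axis (m)
F = 1 / 298.257223563      # flattening
B = (1 - F) * A            # semi-minor axis (m)

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in metres between two points on the WGS-84 ellipsoid."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    A_coef = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B_coef = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B_coef * sin_sigma * (
        cos_2sigma_m + B_coef / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B_coef / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return B * A_coef * (sigma - delta_sigma)

# HRI -> HJJ, coordinates from the airport tables below (decimal degrees)
metres = vincenty_inverse(6.2844, 81.1239, 27.4408, 109.7000)
print(f"{metres / 1609.344:.1f} mi, {metres / 1000:.1f} km, {metres / 1852:.1f} NM")
```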
Haversine formula
- 2376.646 miles
- 3824.841 kilometers
- 2065.249 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
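For comparison, a short haversine sketch in Python. The mean Earth radius of 6371 km is an assumption, since the page doesn't state which radius it uses, so the last decimal may not match the figure above exactly.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine(6.2844, 81.1239, 27.4408, 109.7000)   # HRI -> HJJ
print(f"{km:.1f} km = {km / 1.609344:.1f} mi = {km / 1.852:.1f} NM")
```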
How long does it take to fly from Hambantota to Huaihua?
The estimated flight time from Hambantota Mattala Rajapaksa International Airport to Huaihua Zhijiang Airport is 4 hours and 59 minutes.
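The page doesn't say how this estimate is derived. A common rule of thumb, used here purely as an assumption rather than this site's published method, adds about 30 minutes for taxi, takeoff, and landing to cruise time at roughly 500 mph; it lands in the same ballpark as the figure above, though not exactly on it.

```python
CRUISE_MPH = 500          # assumed average cruise speed
OVERHEAD_MIN = 30         # assumed allowance for taxi, takeoff and landing
distance_mi = 2374.584    # Vincenty distance from above

total_min = OVERHEAD_MIN + distance_mi / CRUISE_MPH * 60
print(f"~{int(total_min // 60)} h {int(total_min % 60)} min")   # ~5 h 14 min
```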
What is the time difference between Hambantota and Huaihua?
Sri Lanka is on UTC+5:30 and China (including Huaihua) is on UTC+8, so Huaihua is 2 hours 30 minutes ahead of Hambantota.
Flight carbon footprint between Hambantota Mattala Rajapaksa International Airport (HRI) and Huaihua Zhijiang Airport (HJJ)
On average, flying from Hambantota to Huaihua generates about 261 kg of CO2 per passenger, which is roughly 574 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
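The pound figure is just a unit conversion of the per-passenger estimate (1 kg ≈ 2.20462 lb); converting the rounded 261 kg gives about 575 lb, so the page's 574 lbs presumably comes from the unrounded value.

```python
co2_kg = 261                        # page's rounded per-passenger estimate
co2_lb = co2_kg * 2.20462           # 1 kg is about 2.20462 lb
print(f"{co2_kg} kg ~ {co2_lb:.0f} lb")   # 261 kg ~ 575 lb
```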
Map of flight path from Hambantota to Huaihua
See the map of the shortest flight path between Hambantota Mattala Rajapaksa International Airport (HRI) and Huaihua Zhijiang Airport (HJJ).
Airport information
| Origin | Hambantota Mattala Rajapaksa International Airport |
| --- | --- |
| City | Hambantota |
| Country | Sri Lanka |
| IATA Code | HRI |
| ICAO Code | VCRI |
| Coordinates | 6°17′4″N, 81°7′26″E |
| Destination | Huaihua Zhijiang Airport |
| --- | --- |
| City | Huaihua |
| Country | China |
| IATA Code | HJJ |
| ICAO Code | ZGCJ |
| Coordinates | 27°26′27″N, 109°42′0″E |
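The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page take decimal degrees. A small helper like the one below (hypothetical, not something the page provides) performs the conversion and reproduces the values used in the sketches above.

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

hri = (dms_to_decimal(6, 17, 4, "N"), dms_to_decimal(81, 7, 26, "E"))     # ~ (6.2844, 81.1239)
hjj = (dms_to_decimal(27, 26, 27, "N"), dms_to_decimal(109, 42, 0, "E"))  # ~ (27.4408, 109.7000)
print(hri, hjj)
```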