How far is Chita from Hami?
The distance between Hami (Hami Airport) and Chita (Chita-Kadala International Airport) is 1112 miles / 1790 kilometers / 967 nautical miles.
The driving distance from Hami (HMI) to Chita (HTA) is 1717 miles / 2764 kilometers, and travel time by car is about 38 hours 31 minutes.
Hami Airport – Chita-Kadala International Airport
Distance from Hami to Chita
There are several ways to calculate the distance from Hami to Chita. Here are two standard methods:
Vincenty's formula (applied above)
- 1112.252 miles
- 1789.996 kilometers
- 966.521 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
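The iterative inverse method can be sketched in Python. This is an independent sketch of the standard Vincenty (1975) inverse algorithm on the WGS-84 ellipsoid, not the site's actual code; the coordinates are taken from the airport table below.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = a * (1 - f)               # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); cos2_alpha is 0 only for equatorial geodesics
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# HMI (42°50′29″N, 93°40′9″E) to HTA (52°1′34″N, 113°18′21″E)
hmi = (42 + 50/60 + 29/3600, 93 + 40/60 + 9/3600)
hta = (52 + 1/60 + 34/3600, 113 + 18/60 + 21/3600)
print(round(vincenty_inverse(*hmi, *hta) / 1000))  # ≈ 1790 km
```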
Haversine formula
- 1110.111 miles
- 1786.551 kilometers
- 964.660 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
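The haversine calculation is compact enough to sketch directly; this assumes a mean Earth radius of 6371 km and uses the airport coordinates from the table below, converted to decimal degrees.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# HMI (42.8414°N, 93.6692°E) to HTA (52.0261°N, 113.3058°E)
d = haversine(42.8414, 93.6692, 52.0261, 113.3058)
print(f"{d:.1f} km")  # close to the 1786.551 km figure above
```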
How long does it take to fly from Hami to Chita?
The estimated flight time from Hami Airport to Chita-Kadala International Airport is 2 hours and 36 minutes.
What is the time difference between Hami and Chita?
The time difference between Hami and Chita is 1 hour. Chita is 1 hour ahead of Hami.
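The offset can be checked with Python's `zoneinfo`. This assumes Hami keeps official China Standard Time (`Asia/Shanghai`, UTC+8) and Chita uses `Asia/Chita` (UTC+9); neither zone observes daylight saving time.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

moment = datetime(2024, 6, 1, 12, 0)
hami = moment.replace(tzinfo=ZoneInfo("Asia/Shanghai"))  # UTC+8
chita = moment.replace(tzinfo=ZoneInfo("Asia/Chita"))    # UTC+9
diff = chita.utcoffset() - hami.utcoffset()
print(diff)  # 1:00:00
```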
Flight carbon footprint between Hami Airport (HMI) and Chita-Kadala International Airport (HTA)
On average, flying from Hami to Chita generates about 157 kg of CO2 per passenger, roughly 346 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
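The kilogram-to-pound conversion uses the exact definition of the avoirdupois pound (1 lb = 0.45359237 kg); small discrepancies in published figures come from rounding the kilogram value first.

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound

co2_kg = 157
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # 346
```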
Map of flight path and driving directions from Hami to Chita
See the map of the shortest flight path between Hami Airport (HMI) and Chita-Kadala International Airport (HTA).
Airport information
| Origin | Hami Airport |
| --- | --- |
| City | Hami |
| Country | China |
| IATA Code | HMI |
| ICAO Code | ZWHM |
| Coordinates | 42°50′29″N, 93°40′9″E |
| Destination | Chita-Kadala International Airport |
| --- | --- |
| City | Chita |
| Country | Russia |
| IATA Code | HTA |
| ICAO Code | UIAA |
| Coordinates | 52°1′34″N, 113°18′21″E |