How far is Shihezi from Batagay?
The distance between Batagay (Batagay Airport) and Shihezi (Shihezi Huayuan Airport) is 2385 miles / 3839 kilometers / 2073 nautical miles.
The driving distance from Batagay (BQJ) to Shihezi (SHF) is 4094 miles / 6588 kilometers, and travel time by car is about 109 hours 17 minutes.
Batagay Airport – Shihezi Huayuan Airport
Distance from Batagay to Shihezi
There are several ways to calculate the distance from Batagay to Shihezi. Here are two standard methods:
Vincenty's formula (applied above)
- 2385.306 miles
- 3838.778 kilometers
- 2072.774 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
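For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are the decimal-degree equivalents of the DMS values in the airport tables below; the function name and tolerance are illustrative, and the near-antipodal case (where the iteration can fail to converge) is not handled.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty, 1975)."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis (m)

    # Reduced latitudes and the difference in longitude, in radians.
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); defined as 0 on equatorial lines (cos2_alpha == 0).
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# BQJ -> SHF, coordinates in decimal degrees.
meters = vincenty_inverse(67.6478, 134.6950, 44.2419, 85.8903)
print(f"{meters / 1609.344:.3f} mi, {meters / 1000:.3f} km, {meters / 1852:.3f} nmi")
```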
Haversine formula
- 2379.640 miles
- 3829.660 kilometers
- 2067.851 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
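A matching sketch of the haversine formula follows. The mean Earth radius of 6371.0 km is an assumption and may differ slightly from the radius this site uses, so the last decimals can disagree.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

# BQJ -> SHF, coordinates in decimal degrees.
km = haversine(67.6478, 134.6950, 44.2419, 85.8903)
print(f"{km / 1.609344:.3f} mi, {km:.3f} km, {km / 1.852:.3f} nmi")
```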
How long does it take to fly from Batagay to Shihezi?
The estimated flight time from Batagay Airport to Shihezi Huayuan Airport is 5 hours and 0 minutes.
What is the time difference between Batagay and Shihezi?
The time difference between Batagay and Shihezi is 4 hours: Shihezi is 4 hours behind Batagay.
Flight carbon footprint between Batagay Airport (BQJ) and Shihezi Huayuan Airport (SHF)
On average, flying from Batagay to Shihezi generates about 262 kg (577 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
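The kilogram-to-pound conversion is plain arithmetic (1 kg ≈ 2.20462 lb), as a quick check shows:

```python
KG_TO_LB = 2.20462262        # pounds per kilogram
co2_kg = 262                 # estimated CO2 per passenger (figure from the text)
print(f"{co2_kg * KG_TO_LB:.1f} lb")  # 577.6 lb, reported as 577 lb above
```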
Map of flight path and driving directions from Batagay to Shihezi
See the map of the shortest flight path between Batagay Airport (BQJ) and Shihezi Huayuan Airport (SHF).
Airport information
| Origin | Batagay Airport |
| --- | --- |
| City: | Batagay |
| Country: | Russia |
| IATA Code: | BQJ |
| ICAO Code: | UEBB |
| Coordinates: | 67°38′52″N, 134°41′42″E |

| Destination | Shihezi Huayuan Airport |
| --- | --- |
| City: | Shihezi |
| Country: | China |
| IATA Code: | SHF |
| ICAO Code: | ZWHZ |
| Coordinates: | 44°14′31″N, 85°53′25″E |
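The tables give coordinates in degrees-minutes-seconds; a small helper (the name dms_to_decimal is illustrative) converts them to the decimal degrees used by the distance functions above:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 67°38′52″N to signed decimal degrees."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value

# Coordinates from the airport tables above.
print(dms_to_decimal("67°38′52″N"), dms_to_decimal("134°41′42″E"))  # Batagay (BQJ)
print(dms_to_decimal("44°14′31″N"), dms_to_decimal("85°53′25″E"))   # Shihezi (SHF)
```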