How far is Shihezi from Omitama?
The distance between Omitama (Ibaraki Airport) and Shihezi (Shihezi Huayuan Airport) is 2883 miles / 4640 kilometers / 2505 nautical miles.
The driving distance from Omitama (IBR) to Shihezi (SHF) is 3681 miles / 5924 kilometers, and travel time by car is about 81 hours 14 minutes.
Ibaraki Airport – Shihezi Huayuan Airport
Distance from Omitama to Shihezi
There are several ways to calculate the distance from Omitama to Shihezi. Here are two standard methods:
Vincenty's formula (applied above)
- 2883.092 miles
- 4639.886 kilometers
- 2505.338 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
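As a rough illustration of how such an ellipsoidal calculation works, here is a minimal Python sketch of the Vincenty inverse formula on the WGS-84 ellipsoid, using the IBR and SHF coordinates from the airport tables below (converted to decimal degrees). It is a simplified implementation for illustration, not necessarily the exact code behind the figure above.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns distance in km."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# IBR (36°10′51″N, 140°24′53″E) to SHF (44°14′31″N, 85°53′25″E), decimal degrees
print(round(vincenty_km(36.1808, 140.4147, 44.2419, 85.8903), 1))  # ≈ 4640 km
```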
Haversine formula
- 2876.290 miles
- 4628.941 kilometers
- 2499.428 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
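A minimal Python sketch of the haversine calculation, assuming a mean Earth radius of 6,371 km and again using the IBR and SHF coordinates in decimal degrees:

```python
from math import radians, sin, cos, atan2, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical Earth (haversine formula)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * atan2(sqrt(a), sqrt(1 - a))

# IBR (36°10′51″N, 140°24′53″E) to SHF (44°14′31″N, 85°53′25″E), decimal degrees
print(round(haversine_km(36.1808, 140.4147, 44.2419, 85.8903), 1))  # ≈ 4629 km
```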
How long does it take to fly from Omitama to Shihezi?
The estimated flight time from Ibaraki Airport to Shihezi Huayuan Airport is 5 hours and 57 minutes.
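The method behind this estimate is not stated; a common back-of-the-envelope approach is to divide the straight-line distance by an assumed average speed and add a fixed allowance for taxi, climb, and descent. The sketch below uses a hypothetical 500 mph average speed and a 30-minute allowance (both assumptions, not the site's actual constants), so its result only roughly approximates the quoted figure.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough gate-to-gate time: cruise segment plus a fixed taxi/climb/descent allowance."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(2883))  # ≈ 6 h 16 min with these assumed constants
```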
What is the time difference between Omitama and Shihezi?
The time difference between Omitama and Shihezi is 3 hours. Shihezi is 3 hours behind Omitama.
Flight carbon footprint between Ibaraki Airport (IBR) and Shihezi Huayuan Airport (SHF)
On average, flying from Omitama to Shihezi generates about 320 kg of CO2 per passenger, which is equal to 706 pounds (lbs). This figure is an estimate and includes only the CO2 generated by burning jet fuel.
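The exact methodology behind this figure is not given; as a rough illustration only, multiplying the great-circle distance by an assumed long-haul emission factor of about 0.07 kg CO2 per passenger-kilometre (an assumption, not the calculator's actual value) lands in the same range.

```python
# Assumed emission factor for a long-haul flight; for illustration only.
EMISSION_FACTOR_KG_PER_PKM = 0.07
KG_PER_LB = 0.453592

distance_km = 4640
co2_kg = distance_km * EMISSION_FACTOR_KG_PER_PKM
print(f"{co2_kg:.0f} kg CO2 (~{co2_kg / KG_PER_LB:.0f} lb)")  # ≈ 325 kg (~716 lb)
```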
Map of flight path and driving directions from Omitama to Shihezi
See the map of the shortest flight path between Ibaraki Airport (IBR) and Shihezi Huayuan Airport (SHF).
Airport information
| Origin | Ibaraki Airport |
| --- | --- |
| City: | Omitama |
| Country: | Japan |
| IATA Code: | IBR |
| ICAO Code: | RJAH |
| Coordinates: | 36°10′51″N, 140°24′53″E |
| Destination | Shihezi Huayuan Airport |
| --- | --- |
| City: | Shihezi |
| Country: | China |
| IATA Code: | SHF |
| ICAO Code: | ZWHZ |
| Coordinates: | 44°14′31″N, 85°53′25″E |