How far is Shihezi from Bayda?
The distance between Bayda (Al Abraq International Airport) and Shihezi (Shihezi Huayuan Airport) is 3464 miles / 5574 kilometers / 3010 nautical miles.
Al Abraq International Airport – Shihezi Huayuan Airport
Distance from Bayda to Shihezi
There are several ways to calculate the distance from Bayda to Shihezi. Here are two standard methods:
Vincenty's formula (applied above)
- 3463.535 miles
- 5574.019 kilometers
- 3009.729 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
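The article does not publish its code; as a minimal sketch, here is one common Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, fed with the airport coordinates from the tables below. The tolerance, iteration cap, and mile conversion are illustrative choices, not the site's.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

# A minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid.
# No guards for coincident or near-antipodal points are included.

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    a = 6378137.0            # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis

    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sin_u1, cos_u1 = sin(U1), cos(U1)
    sin_u2, cos_u2 = sin(U2), cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cos_u2 * sin_lam) ** 2
                         + (cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam) ** 2)
        cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sin_u1 * sin_u2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
          * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # meters -> statute miles

# Airport coordinates from the tables below, converted from DMS to degrees.
laq = (32 + 47 / 60 + 19 / 3600, 21 + 57 / 60 + 51 / 3600)   # LAQ
shf = (44 + 14 / 60 + 31 / 3600, 85 + 53 / 60 + 25 / 3600)   # SHF
print(round(vincenty_miles(*laq, *shf), 3))  # ≈ 3463.5 (article: 3463.535)
```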
Haversine formula
- 3455.957 miles
- 5561.824 kilometers
- 3003.145 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
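Likewise, a minimal haversine sketch. The mean earth radius of 6371 km is an assumed constant; the article does not say which radius it uses.

```python
from math import asin, cos, radians, sin, sqrt

# Great-circle distance via the haversine formula on a sphere.
def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a)) / 1.609344  # km -> statute miles

print(round(haversine_miles(32.7886, 21.9642, 44.2419, 85.8903), 1))
# ≈ 3455.9 miles (article: 3455.957)
```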
How long does it take to fly from Bayda to Shihezi?
The estimated flight time from Al Abraq International Airport to Shihezi Huayuan Airport is 7 hours and 3 minutes.
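The article does not state how this estimate is derived. A common rule of thumb divides the great-circle distance by an average airliner speed of about 500 mph; a sketch under that assumption lands close to the quoted figure.

```python
# Rough flight-time estimate. The 500 mph average speed is an assumption,
# not the site's constant, which is why the result differs slightly.
distance_miles = 3463.535
avg_speed_mph = 500

hours = distance_miles / avg_speed_mph
h, m = int(hours), round(hours % 1 * 60)
print(f"{h} h {m} min")  # 6 h 56 min vs. the quoted 7 h 3 min
```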
What is the time difference between Bayda and Shihezi?
The time difference between Bayda and Shihezi is 4 hours: Shihezi is 4 hours ahead of Bayda.
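The 4-hour figure can be reproduced if one assumes the site maps Shihezi to the IANA zone Asia/Urumqi (UTC+6, unofficial Xinjiang time) and Bayda to Africa/Tripoli (UTC+2); that zone choice is an assumption, and official China time (Asia/Shanghai, UTC+8) would give 6 hours instead.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Offset difference under the assumed zone choices described above.
t = datetime(2024, 1, 1, 12, 0)
bayda = t.replace(tzinfo=ZoneInfo("Africa/Tripoli"))   # UTC+2
shihezi = t.replace(tzinfo=ZoneInfo("Asia/Urumqi"))    # UTC+6 (unofficial)
print(shihezi.utcoffset() - bayda.utcoffset())  # 4:00:00
```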
Flight carbon footprint between Al Abraq International Airport (LAQ) and Shihezi Huayuan Airport (SHF)
On average, flying from Bayda to Shihezi generates about 390 kg of CO2 per passenger, which is equivalent to 860 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
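The kilogram-to-pound conversion can be checked directly:

```python
# Checking the conversion quoted above (1 kg ≈ 2.20462 lb).
print(round(390 * 2.20462))  # 860
```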
Map of flight path from Bayda to Shihezi
See the map of the shortest flight path between Al Abraq International Airport (LAQ) and Shihezi Huayuan Airport (SHF).
Airport information
| Origin | Al Abraq International Airport |
| --- | --- |
| City: | Bayda |
| Country: | Libya |
| IATA Code: | LAQ |
| ICAO Code: | HLLQ |
| Coordinates: | 32°47′19″N, 21°57′51″E |
| Destination | Shihezi Huayuan Airport |
| --- | --- |
| City: | Shihezi |
| Country: | China |
| IATA Code: | SHF |
| ICAO Code: | ZWHZ |
| Coordinates: | 44°14′31″N, 85°53′25″E |