How far is Shihezi from Bayda?

The distance between Bayda (Al Abraq International Airport) and Shihezi (Shihezi Huayuan Airport) is 3464 miles / 5574 kilometers / 3010 nautical miles.

Al Abraq International Airport – Shihezi Huayuan Airport

3464 miles / 5574 kilometers / 3010 nautical miles

Distance from Bayda to Shihezi

There are several ways to calculate the distance from Bayda to Shihezi. Here are two standard methods:

Vincenty's formula (applied above)
  • 3463.535 miles
  • 5574.019 kilometers
  • 3009.729 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
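
As a cross-check of the ellipsoidal figure, here is a minimal sketch using the geopy library (an assumption; the calculator does not state which software it uses). geopy's geodesic solver uses Karney's method on the WGS-84 ellipsoid rather than Vincenty's iteration, but at this range the two agree to well under a mile.

```python
from geopy.distance import geodesic

# Airport coordinates in decimal degrees, converted from the airport information below
laq = (32.788611, 21.964167)   # Al Abraq International Airport (LAQ)
shf = (44.241944, 85.890278)   # Shihezi Huayuan Airport (SHF)

d = geodesic(laq, shf)  # WGS-84 ellipsoid by default
print(f"{d.miles:.3f} mi, {d.km:.3f} km, {d.nautical:.3f} nm")
# Should land very close to the ~3463.5-mile ellipsoidal figure quoted above
```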

Haversine formula
  • 3455.957 miles
  • 5561.824 kilometers
  • 3003.145 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
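
The haversine formula is simple enough to implement directly. The following sketch assumes a mean Earth radius of 3958.8 miles (about 6371 km); the exact radius chosen shifts the result slightly.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_miles * asin(sqrt(a))

# Coordinates in decimal degrees, converted from the airport information below
laq = (32.788611, 21.964167)   # LAQ: 32°47′19″N, 21°57′51″E
shf = (44.241944, 85.890278)   # SHF: 44°14′31″N, 85°53′25″E

print(haversine_miles(laq[0], laq[1], shf[0], shf[1]))
# Should land close to the ~3456-mile haversine figure quoted above
```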

How long does it take to fly from Bayda to Shihezi?

The estimated flight time from Al Abraq International Airport to Shihezi Huayuan Airport is 7 hours and 3 minutes.
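
The calculator does not publish its timing formula. A common rule of thumb is an assumed average block speed of roughly 500 mph plus a fixed allowance for taxi, climb, and descent; the sketch below uses those assumed numbers and will not reproduce the 7 h 03 min figure exactly.

```python
def estimated_flight_time(distance_miles, avg_speed_mph=500, overhead_minutes=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.

    Both parameters are assumptions, not the calculator's actual formula.
    """
    total_minutes = distance_miles / avg_speed_mph * 60 + overhead_minutes
    hours, minutes = divmod(round(total_minutes), 60)
    return f"{hours} h {minutes:02d} min"

print(estimated_flight_time(3464))  # about 7 h 26 min under these assumptions
```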

Flight carbon footprint between Al Abraq International Airport (LAQ) and Shihezi Huayuan Airport (SHF)

On average, flying from Bayda to Shihezi generates about 390 kg of CO2 per passenger; 390 kilograms is equivalent to about 860 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
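
The unit conversion and the implied emission intensity can be checked directly from the figures quoted above; nothing here beyond those two numbers is assumed.

```python
CO2_KG = 390   # per-passenger estimate from the text above
KM = 5574      # route distance from the text above

pounds = CO2_KG * 2.20462          # kilograms -> pounds
per_km = CO2_KG / KM * 1000        # grams of CO2 per passenger-kilometre
print(f"{pounds:.0f} lb total, about {per_km:.0f} g CO2 per passenger-km")
# ≈ 860 lb and ≈ 70 g CO2 per passenger-km
```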

Map of flight path from Bayda to Shihezi

See the map of the shortest flight path between Al Abraq International Airport (LAQ) and Shihezi Huayuan Airport (SHF).

Airport information

Origin: Al Abraq International Airport
City: Bayda
Country: Libya
IATA Code: LAQ
ICAO Code: HLLQ
Coordinates: 32°47′19″N, 21°57′51″E
Destination: Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E