
How far is Shihezi from Guatemala City?

The distance between Guatemala City (La Aurora International Airport) and Shihezi (Shihezi Huayuan Airport) is 8376 miles / 13480 kilometers / 7279 nautical miles.

La Aurora International Airport – Shihezi Huayuan Airport

Distance: 8376 miles / 13480 kilometers / 7279 nautical miles
Flight time: 16 h 21 min
CO2 emission: 1 054 kg


Distance from Guatemala City to Shihezi

There are several ways to calculate the distance from Guatemala City to Shihezi. Here are two standard methods:

Vincenty's formula (applied above)
  • 8376.243 miles
  • 13480.257 kilometers
  • 7278.756 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
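
For readers who want to check the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name vincenty_distance_km and the convergence tolerance are my own choices; fed the GUA and SHF coordinates listed under "Airport information" below, it should come out very close to the 13480.257 km figure above.

import math

def vincenty_distance_km(lat1, lon1, lat2, lon2):
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); defined as 0 for equatorial lines (cos2_alpha == 0)
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha != 0 else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # metres -> kilometers

# GUA (14°34′59″N, 90°31′38″W) to SHF (44°14′31″N, 85°53′25″E)
print(vincenty_distance_km(14.58306, -90.52722, 44.24194, 85.89028))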

Haversine formula
  • 8366.103 miles
  • 13463.938 kilometers
  • 7269.945 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
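
A minimal Python sketch of the haversine calculation, assuming the commonly used mean Earth radius of 6371 km (the site does not state which radius it uses); the function name is my own. With the same GUA and SHF coordinates it should land close to the 13463.938 km figure above.

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_km(14.58306, -90.52722, 44.24194, 85.89028))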

How long does it take to fly from Guatemala City to Shihezi?

The estimated flight time from La Aurora International Airport to Shihezi Huayuan Airport is 16 hours and 21 minutes.
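
Estimates like this are typically computed as distance divided by an assumed average speed, plus a fixed allowance for taxi, climb, and descent. The site does not publish its parameters, so the 500 mph cruise speed and 30-minute overhead in the sketch below are assumptions and only roughly approximate the 16 h 21 min shown above.

def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    # Assumed-parameter sketch: block time = fixed overhead + cruise time.
    hours = overhead_hours + distance_miles / cruise_mph
    h = int(hours)
    m = round((hours - h) * 60)
    if m == 60:
        h, m = h + 1, 0
    return f"{h} h {m} min"

print(estimate_flight_time(8376))   # about 17 h 15 min under these assumptions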

Flight carbon footprint between La Aurora International Airport (GUA) and Shihezi Huayuan Airport (SHF)

On average, flying from Guatemala City to Shihezi generates about 1 054 kg of CO2 per passenger, which equals roughly 2 323 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
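
The pound figure is a straight unit conversion (1 kg ≈ 2.20462 lb), and dividing the emission by the flight distance gives the constant per-mile factor the estimate implies. The factor below is derived from the two published numbers, not taken from the site, which does not disclose its methodology.

KG_TO_LB = 2.20462

co2_kg = 1054
print(round(co2_kg * KG_TO_LB))   # ≈ 2324 lb; the page's 2 323 lb suggests its
                                  # unrounded kg figure is slightly below 1 054
print(co2_kg / 8376)              # implied factor: ≈ 0.126 kg CO2 per passenger-mile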

Map of flight path from Guatemala City to Shihezi

See the map of the shortest flight path between La Aurora International Airport (GUA) and Shihezi Huayuan Airport (SHF).

Airport information

Origin: La Aurora International Airport
City: Guatemala City
Country: Guatemala
IATA Code: GUA
ICAO Code: MGGT
Coordinates: 14°34′59″N, 90°31′38″W

Destination: Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E