How far is Shihezi from Puerto Plata?

The distance between Puerto Plata (Gregorio Luperón International Airport) and Shihezi (Shihezi Huayuan Airport) is 7783 miles / 12526 kilometers / 6763 nautical miles.

Gregorio Luperón International Airport – Shihezi Huayuan Airport

7783 miles / 12526 kilometers / 6763 nautical miles

Distance from Puerto Plata to Shihezi

There are several ways to calculate the distance from Puerto Plata to Shihezi. Here are two standard methods:

Vincenty's formula (applied above)
  • 7783.105 miles
  • 12525.693 kilometers
  • 6763.333 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
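The sketch below is a minimal Python implementation of Vincenty's inverse method on the WGS-84 ellipsoid, offered only so the figure above can be reproduced. The airport coordinates are taken from the airport information section at the bottom of this page; the WGS-84 constants are the standard values, though the site does not state which ellipsoid it uses, so agreement to the last decimal is not guaranteed.

    import math

    # WGS-84 ellipsoid parameters (standard values; an assumption about the
    # ellipsoid used on this page)
    A_AXIS = 6378137.0          # semi-major axis in meters
    F = 1 / 298.257223563       # flattening
    B_AXIS = (1 - F) * A_AXIS   # semi-minor axis in meters

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Ellipsoidal distance in meters between two points given in degrees."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        L = math.radians(lon2 - lon1)

        # Reduced latitudes on the auxiliary sphere
        U1 = math.atan((1 - F) * math.tan(phi1))
        U2 = math.atan((1 - F) * math.tan(phi2))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos(2 * sigma_m); zero for equatorial lines (cos2_alpha == 0)
            cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                            if cos2_alpha != 0 else 0.0)
            C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * F * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (
            cos_2sigma_m + B / 4 * (
                cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
                - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
        return B_AXIS * A * (sigma - delta_sigma)

    # POP (19°45′28″N, 70°34′11″W) to SHF (44°14′31″N, 85°53′25″E)
    meters = vincenty_inverse(19.757778, -70.569722, 44.241944, 85.890278)
    print(f"{meters / 1609.344:.1f} mi / {meters / 1000:.1f} km")  # ≈ 7783.1 mi / 12525.7 km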

Haversine formula
  • 7771.220 miles
  • 12506.566 kilometers
  • 6753.005 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
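Because the haversine formula is fully determined by the great-circle equation, the figure above is easy to check directly. A minimal Python sketch, assuming the commonly used mean Earth radius of 6371 km (the site does not state which radius it uses):

    import math

    EARTH_RADIUS_KM = 6371.0  # mean Earth radius; a common choice for haversine

    def haversine(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometers between two points given in degrees."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(h))

    # POP (19°45′28″N, 70°34′11″W) to SHF (44°14′31″N, 85°53′25″E)
    km = haversine(19.757778, -70.569722, 44.241944, 85.890278)
    print(f"{km:.1f} km / {km / 1.609344:.1f} mi")  # ≈ 12506.6 km / 7771.2 mi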

How long does it take to fly from Puerto Plata to Shihezi?

The estimated flight time from Gregorio Luperón International Airport to Shihezi Huayuan Airport is 15 hours and 14 minutes.
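The site does not publish its flight-time formula. A common rule of thumb, shown below purely as an illustration, adds a fixed taxi/climb/descent allowance to cruise time at a typical jet speed; both constants here are assumptions, and they land in the same ballpark as, but not exactly on, the 15 hours 14 minutes quoted above.

    CRUISE_MPH = 500     # assumed average cruise speed, not the site's figure
    BUFFER_HOURS = 0.5   # assumed allowance for taxi, climb and descent

    def estimated_flight_hours(distance_miles):
        """Rule-of-thumb flight time: fixed buffer plus cruise time."""
        return BUFFER_HOURS + distance_miles / CRUISE_MPH

    hours = estimated_flight_hours(7783)
    h, m = int(hours), round((hours % 1) * 60)
    print(f"{h} h {m} min")  # ≈ 16 h 4 min with these assumed constants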

Flight carbon footprint between Gregorio Luperón International Airport (POP) and Shihezi Huayuan Airport (SHF)

On average, flying from Puerto Plata to Shihezi generates about 967 kg of CO2 per passenger, which is roughly 2,131 pounds (lbs). These figures are estimates and cover only the CO2 produced by burning jet fuel.
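The kilograms-to-pounds conversion can be checked directly. The definition 1 lb = 0.45359237 kg is exact, so only the site's underlying 967 kg estimate is model-dependent:

    KG_PER_LB = 0.45359237  # exact definition of the international pound

    co2_kg = 967
    co2_lb = co2_kg / KG_PER_LB
    print(f"{co2_lb:.1f} lb")  # ≈ 2131.9 lb, matching the ~2,131 lb figure above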

Map of flight path from Puerto Plata to Shihezi

See the map of the shortest flight path between Gregorio Luperón International Airport (POP) and Shihezi Huayuan Airport (SHF).

Airport information

Origin Gregorio Luperón International Airport
City: Puerto Plata
Country: Dominican Republic
IATA Code: POP
ICAO Code: MDPP
Coordinates: 19°45′28″N, 70°34′11″W
Destination Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E
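The coordinates above are given in degrees, minutes and seconds. A small helper to convert them to the decimal degrees used by the distance formulas earlier on the page (southern and western hemispheres take a negative sign):

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(19, 45, 28, "N"))  # ≈ 19.757778  (POP latitude)
    print(dms_to_decimal(70, 34, 11, "W"))  # ≈ -70.569722 (POP longitude)
    print(dms_to_decimal(44, 14, 31, "N"))  # ≈ 44.241944  (SHF latitude)
    print(dms_to_decimal(85, 53, 25, "E"))  # ≈ 85.890278  (SHF longitude)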