
How far is Parachinar from Xingyi?

The distance between Xingyi (Xingyi Wanfenglin Airport) and Parachinar (Parachinar Airport) is 2176 miles / 3503 kilometers / 1891 nautical miles.

The driving distance from Xingyi (ACX) to Parachinar (PAJ) is 3570 miles / 5746 kilometers, and travel time by car is about 68 hours 50 minutes.

Xingyi Wanfenglin Airport – Parachinar Airport

  • 2176 miles
  • 3503 kilometers
  • 1891 nautical miles
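The three figures above are the same distance in different units. As a sketch, the miles and nautical-mile values can be derived from the kilometer figure with the exact definitional conversion factors:

```python
# Convert the ACX-PAJ distance from kilometers to statute and nautical miles.
# Both conversion factors are exact by definition.
KM_PER_MILE = 1.609344        # international statute mile
KM_PER_NAUTICAL_MILE = 1.852  # international nautical mile

distance_km = 3502.567  # Vincenty distance quoted on this page

miles = distance_km / KM_PER_MILE
nautical_miles = distance_km / KM_PER_NAUTICAL_MILE

print(f"{miles:.3f} mi, {nautical_miles:.3f} NM")  # ≈ 2176.396 mi, 1891.235 NM
```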


Distance from Xingyi to Parachinar

There are several ways to calculate the distance from Xingyi to Parachinar. Here are two standard methods:

Vincenty's formula (applied above)
  • 2176.394 miles
  • 3502.567 kilometers
  • 1891.235 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
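As an illustration (not the site's actual code), here is a self-contained sketch of the inverse Vincenty iteration on the WGS-84 ellipsoid, using the airport coordinates listed below converted to decimal degrees:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns meters.

    Simplified sketch: does not handle coincident or near-antipodal points.
    """
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate the longitude on the auxiliary sphere
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (2 * cos_2sigma_m ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# ACX (25°5′11″N, 104°57′33″E) and PAJ (33°54′7″N, 70°4′17″E)
d = vincenty_distance(25.086389, 104.959167, 33.901944, 70.071389)
print(f"{d / 1000:.3f} km")  # ≈ 3502.6 km
```

In practice a library such as geographiclib is preferable, since it converges even for near-antipodal points where plain Vincenty can fail.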

Haversine formula
  • 2173.062 miles
  • 3497.204 kilometers
  • 1888.339 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
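A minimal haversine implementation, assuming the conventional mean Earth radius of 6371 km, reproduces the spherical figure above to within a few kilometers:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# ACX (25°5′11″N, 104°57′33″E) and PAJ (33°54′7″N, 70°4′17″E)
d = haversine_km(25.086389, 104.959167, 33.901944, 70.071389)
print(f"{d:.1f} km")  # ≈ 3497 km
```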

How long does it take to fly from Xingyi to Parachinar?

The estimated flight time from Xingyi Wanfenglin Airport to Parachinar Airport is 4 hours and 37 minutes.
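One way to sanity-check this figure (the page does not state its model) is to compute the average ground speed it implies, which comes out near a typical regional-jet block speed:

```python
distance_miles = 2176.394    # Vincenty distance from above
flight_hours = 4 + 37 / 60   # the page's 4 h 37 min estimate

avg_speed_mph = distance_miles / flight_hours
print(f"implied average speed: {avg_speed_mph:.0f} mph")  # ≈ 471 mph
```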

Flight carbon footprint between Xingyi Wanfenglin Airport (ACX) and Parachinar Airport (PAJ)

On average, flying from Xingyi to Parachinar generates about 238 kg of CO2 per passenger, which is roughly 524 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
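The pound figure follows from the exact kilogram-to-pound definition (the page rounds the result down):

```python
KG_PER_POUND = 0.45359237  # exact by definition

co2_kg = 238               # per-passenger estimate from the page
co2_lbs = co2_kg / KG_PER_POUND
print(f"{co2_lbs:.1f} lbs")  # ≈ 524.7 lbs
```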

Map of flight path and driving directions from Xingyi to Parachinar

See the map of the shortest flight path between Xingyi Wanfenglin Airport (ACX) and Parachinar Airport (PAJ).

Airport information

Origin Xingyi Wanfenglin Airport
City: Xingyi
Country: China
IATA Code: ACX
ICAO Code: ZUYI
Coordinates: 25°5′11″N, 104°57′33″E
Destination Parachinar Airport
City: Parachinar
Country: Pakistan
IATA Code: PAJ
ICAO Code: OPPC
Coordinates: 33°54′7″N, 70°4′17″E
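The coordinates above are given in degrees/minutes/seconds; distance formulas need them as signed decimal degrees. A small helper for the conversion, shown on the Xingyi coordinates:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds + hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Xingyi Wanfenglin Airport: 25°5′11″N, 104°57′33″E
lat = dms_to_decimal(25, 5, 11, "N")
lon = dms_to_decimal(104, 57, 33, "E")
print(f"{lat:.6f}, {lon:.6f}")  # ≈ 25.086389, 104.959167
```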