
How far is Parachinar from Baishan?

The distance between Baishan (Changbaishan Airport) and Parachinar (Parachinar Airport) is 3131 miles / 5039 kilometers / 2721 nautical miles.

The driving distance from Baishan (NBS) to Parachinar (PAJ) is 4072 miles / 6553 kilometers, and travel time by car is about 75 hours 21 minutes.

Changbaishan Airport – Parachinar Airport: 3131 miles / 5039 kilometers / 2721 nautical miles


Distance from Baishan to Parachinar

There are several ways to calculate the distance from Baishan to Parachinar. Here are two standard methods:

Vincenty's formula (applied above)
  • 3130.843 miles
  • 5038.603 kilometers
  • 2720.628 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
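For readers who want to reproduce the ellipsoidal figure, the sketch below uses the Python geographiclib package, which computes geodesics on the WGS-84 ellipsoid (Karney's algorithm, a modern refinement of the same ellipsoidal approach Vincenty's formula takes). It is an illustration rather than the calculator's own code; assuming WGS-84 coordinates, the result should land very close to the 5,038.6 km quoted above.

```python
# Ellipsoidal (geodesic) distance between the two airports.
# Sketch only: geographiclib implements Karney's algorithm on WGS-84,
# not necessarily the exact Vincenty routine used by this calculator.
from geographiclib.geodesic import Geodesic

# Decimal degrees converted from the DMS coordinates in the airport information section.
NBS = (42 + 4/60 + 0/3600, 127 + 36/60 + 7/3600)   # Changbaishan Airport
PAJ = (33 + 54/60 + 7/3600, 70 + 4/60 + 17/3600)   # Parachinar Airport

result = Geodesic.WGS84.Inverse(NBS[0], NBS[1], PAJ[0], PAJ[1])
meters = result["s12"]                      # geodesic distance in metres
print(f"{meters / 1000:.1f} km")            # should be close to 5,038.6 km
print(f"{meters / 1609.344:.1f} miles")     # should be close to 3,130.8 miles
```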

Haversine formula
  • 3123.823 miles
  • 5027.306 kilometers
  • 2714.528 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the surface of a sphere).
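The haversine calculation is easy to reproduce. The minimal Python sketch below uses the coordinates from the airport information section (converted to decimal degrees) and a mean Earth radius of 6,371 km, which reproduces the roughly 5,027 km / 3,124 mile figures shown above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Changbaishan (NBS) and Parachinar (PAJ) in decimal degrees.
km = haversine_km(42.0667, 127.6019, 33.9019, 70.0714)
print(f"{km:.0f} km / {km / 1.609344:.0f} miles")   # about 5,027 km / 3,124 miles
```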

How long does it take to fly from Baishan to Parachinar?

The estimated flight time from Changbaishan Airport to Parachinar Airport is 6 hours and 25 minutes.
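The page does not say exactly how the flight time is estimated. A common rule of thumb is distance divided by an assumed average speed, plus a fixed allowance for taxi, climb and descent. The sketch below uses 500 mph and a 30-minute allowance, both illustrative assumptions; it lands in the same ballpark as the 6 hours 25 minutes quoted above, though not the identical number.

```python
# Back-of-the-envelope flight-time estimate. The 500 mph average speed and
# 30-minute climb/descent allowance are assumptions for illustration; the
# calculator's own formula is not published on this page.
def estimate_flight_time(distance_miles, avg_speed_mph=500, overhead_hours=0.5):
    hours = distance_miles / avg_speed_mph + overhead_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} h {m:02d} min"

print(estimate_flight_time(3131))   # about 6 h 46 min with these assumptions
```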

Flight carbon footprint between Changbaishan Airport (NBS) and Parachinar Airport (PAJ)

On average, flying from Baishan to Parachinar generates about 350 kg of CO2 per passenger (roughly 771 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
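The unit conversion and the per-kilometre rate implied by these numbers are simple arithmetic, restated in the snippet below; the 350 kg estimate itself comes from the calculator, not from this code.

```python
KG_PER_LB = 0.45359237            # exact definition of the pound

co2_kg = 350                       # per-passenger estimate quoted above
co2_lb = co2_kg / KG_PER_LB
print(f"{co2_lb:.1f} lb")          # 771.6 lb, rounded to 771 on the page

# Emission rate implied by the figures above, per passenger.
distance_km = 5039
print(f"{co2_kg / distance_km:.3f} kg CO2 per km")   # about 0.069 kg/km
```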

Map of flight path and driving directions from Baishan to Parachinar

See the map of the shortest flight path between Changbaishan Airport (NBS) and Parachinar Airport (PAJ).

Airport information

Origin: Changbaishan Airport
City: Baishan
Country: China
IATA Code: NBS
ICAO Code: ZYBS
Coordinates: 42°4′0″N, 127°36′7″E

Destination: Parachinar Airport
City: Parachinar
Country: Pakistan
IATA Code: PAJ
ICAO Code: OPPC
Coordinates: 33°54′7″N, 70°4′17″E