
How far is Panjgur from Kuwait City?

The distance between Kuwait City (Kuwait International Airport) and Panjgur (Panjgur Airport) is 999 miles / 1607 kilometers / 868 nautical miles.

The driving distance from Kuwait City (KWI) to Panjgur (PJG) is 1561 miles / 2512 kilometers, and travel time by car is about 33 hours 51 minutes.

Kuwait International Airport – Panjgur Airport

999 miles
1607 kilometers
868 nautical miles


Distance from Kuwait City to Panjgur

There are several ways to calculate the distance from Kuwait City to Panjgur. Here are two standard methods:

Vincenty's formula (applied above)
  • 998.563 miles
  • 1607.031 kilometers
  • 867.728 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
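The sketch below shows how this ellipsoidal distance can be reproduced in Python using the geographiclib package, which solves the same WGS84 inverse geodesic problem (via Karney's algorithm rather than Vincenty's original iteration), so the result should agree closely with the 998.563-mile figure above. The decimal coordinates are converted from the airport table at the bottom of this page.

```python
# Ellipsoidal (WGS84) distance between KWI and PJG, a sketch using geographiclib.
from geographiclib.geodesic import Geodesic

# Airport coordinates in decimal degrees (from the airport table below).
KWI = (29.2264, 47.9689)   # Kuwait International Airport, 29°13'35"N 47°58'8"E
PJG = (26.9544, 64.1322)   # Panjgur Airport, 26°57'16"N 64°7'56"E

result = Geodesic.WGS84.Inverse(KWI[0], KWI[1], PJG[0], PJG[1])
meters = result["s12"]                          # geodesic distance in meters

print(f"{meters / 1609.344:.3f} miles")         # ~998.6 mi
print(f"{meters / 1000:.3f} kilometers")        # ~1607.0 km
print(f"{meters / 1852:.3f} nautical miles")    # ~867.7 nm
```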

Haversine formula
  • 996.835 miles
  • 1604.251 kilometers
  • 866.226 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
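Because the haversine formula assumes a sphere, it is simple enough to implement directly. A minimal sketch, assuming a mean earth radius of 6371 km, lands within about a kilometer of the 1604.251 km figure quoted above:

```python
# Great-circle (spherical-earth) distance: a minimal haversine sketch.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, lam1, phi2, lam2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((phi2 - phi1) / 2) ** 2 + cos(phi1) * cos(phi2) * sin((lam2 - lam1) / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(29.2264, 47.9689, 26.9544, 64.1322)   # KWI -> PJG
print(f"{km:.3f} km, {km / 1.609344:.3f} mi, {km / 1.852:.3f} nm")
```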

How long does it take to fly from Kuwait City to Panjgur?

The estimated flight time from Kuwait International Airport to Panjgur Airport is 2 hours and 23 minutes.
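Estimates like this are typically the great-circle distance divided by an assumed average cruise speed, plus a fixed allowance for taxi, climb, and descent. The 500 mph speed and 30-minute allowance in the sketch below are illustrative assumptions, not the calculator's actual parameters, but they land in the same ballpark as the 2 hours 23 minutes quoted above.

```python
# Rough flight-time estimate: distance / assumed cruise speed + fixed overhead.
distance_miles = 998.563
cruise_mph = 500          # assumed average cruise speed (illustrative)
overhead_min = 30         # assumed taxi/climb/descent allowance (illustrative)

total_min = distance_miles / cruise_mph * 60 + overhead_min
print(f"about {int(total_min // 60)} h {int(total_min % 60)} min")   # ~2 h 30 min
```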

Flight carbon footprint between Kuwait International Airport (KWI) and Panjgur Airport (PJG)

On average, flying from Kuwait City to Panjgur generates about 151 kg (332 lbs) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
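The per-passenger figure works out to roughly 0.15 kg of CO2 per mile flown. The sketch below simply back-solves that factor from the numbers on this page (151 kg over about 999 miles) and shows the kg-to-lb conversion; it is not an official emissions methodology.

```python
# Back-of-envelope CO2 estimate: distance times a per-passenger emission factor.
# The 0.151 kg/mile factor is inferred from this page's figures, not a standard.
distance_miles = 998.563
kg_co2_per_mile = 0.151

kg = distance_miles * kg_co2_per_mile
print(f"{kg:.0f} kg CO2 per passenger ({kg * 2.20462:.0f} lbs)")   # ~151 kg, ~332 lbs
```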

Map of flight path and driving directions from Kuwait City to Panjgur

See the map of the shortest flight path between Kuwait International Airport (KWI) and Panjgur Airport (PJG).

Airport information

Origin Kuwait International Airport
City: Kuwait City
Country: Kuwait
IATA Code: KWI
ICAO Code: OKBK
Coordinates: 29°13′35″N, 47°58′8″E
Destination Panjgur Airport
City: Panjgur
Country: Pakistan
IATA Code: PJG
ICAO Code: OPPG
Coordinates: 26°57′16″N, 64°7′56″E
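The coordinates above are listed in degrees, minutes, and seconds; the distance formulas earlier on this page work in decimal degrees. A small sketch of that conversion:

```python
# Convert the DMS coordinates listed above into decimal degrees.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

kwi_lat = dms_to_decimal(29, 13, 35, "N")   # ~29.2264
kwi_lon = dms_to_decimal(47, 58, 8, "E")    # ~47.9689
pjg_lat = dms_to_decimal(26, 57, 16, "N")   # ~26.9544
pjg_lon = dms_to_decimal(64, 7, 56, "E")    # ~64.1322
print(kwi_lat, kwi_lon, pjg_lat, pjg_lon)
```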