
How far is Yibin from Gwadar?

The distance between Gwadar (Gwadar International Airport) and Yibin (Yibin Wuliangye Airport) is 2600 miles / 4184 kilometers / 2259 nautical miles.

The driving distance from Gwadar (GWD) to Yibin (YBP) is 3979 miles / 6404 kilometers, and travel time by car is about 74 hours 33 minutes.

Gwadar International Airport – Yibin Wuliangye Airport

2600 miles / 4184 kilometers / 2259 nautical miles


Distance from Gwadar to Yibin

There are several ways to calculate the distance from Gwadar to Yibin. Here are two standard methods:

Vincenty's formula (applied above)
  • 2599.858 miles
  • 4184.065 kilometers
  • 2259.215 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
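As a rough illustration, here is a minimal pure-Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, iteration cap, and convergence tolerance are our own choices, not the calculator's published implementation:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Iterative Vincenty inverse formula; returns distance in metres."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# GWD 25°13′59″N 62°19′46″E and YBP 28°51′28″N 104°31′30″E in decimal degrees:
metres = vincenty_distance(25.2331, 62.3294, 28.8578, 104.5250)
print(metres / 1609.344)  # ≈ 2600 miles
```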

Haversine formula
  • 2595.285 miles
  • 4176.706 kilometers
  • 2255.241 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
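Running the same coordinates through a haversine sketch (again, the function name and the choice of mean Earth radius are our own assumptions) shows why the spherical result comes out slightly shorter, here about 4.6 miles less than Vincenty's figure:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere with mean Earth radius 3958.8 miles."""
    R = 3958.8
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

print(haversine_miles(25.2331, 62.3294, 28.8578, 104.5250))  # ≈ 2595 miles
```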

How long does it take to fly from Gwadar to Yibin?

The estimated flight time from Gwadar International Airport to Yibin Wuliangye Airport is 5 hours and 25 minutes.
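A simple way to reproduce an estimate like this is cruise time plus a fixed allowance for taxi, climb, and descent. The 530 mph cruise speed and 30-minute overhead below are illustrative assumptions, not the calculator's published model:

```python
def estimate_flight_time(distance_miles, cruise_mph=530, overhead_min=30):
    """Block-time heuristic: fixed taxi/climb/descent allowance plus cruise time."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(2600))  # ≈ 5 hours 24 minutes under these assumptions
```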

Flight carbon footprint between Gwadar International Airport (GWD) and Yibin Wuliangye Airport (YBP)

On average, flying from Gwadar to Yibin generates about 287 kg (632 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
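Estimates like this are typically derived from a per-passenger-mile emission factor. The 0.110 kg factor below is reverse-engineered from the figures above (roughly 287 kg over 2600 miles) and is purely illustrative:

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

def co2_kg(distance_miles, kg_per_passenger_mile=0.110):
    """Illustrative per-passenger CO2 estimate; the emission factor is an assumption."""
    return distance_miles * kg_per_passenger_mile

kg = co2_kg(2600)       # ≈ 286 kg
print(kg / KG_PER_LB)   # ≈ 630 lbs
```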

Map of flight path and driving directions from Gwadar to Yibin

See the map of the shortest flight path between Gwadar International Airport (GWD) and Yibin Wuliangye Airport (YBP).

Airport information

Origin Gwadar International Airport
City: Gwadar
Country: Pakistan
IATA Code: GWD
ICAO Code: OPGD
Coordinates: 25°13′59″N, 62°19′46″E
Destination Yibin Wuliangye Airport
City: Yibin
Country: China
IATA Code: YBP
ICAO Code: ZUYB
Coordinates: 28°51′28″N, 104°31′30″E