
How far is Yibin from Kandla?

The distance between Kandla (Kandla Airport) and Yibin (Yibin Wuliangye Airport) is 2170 miles / 3493 kilometers / 1886 nautical miles.

The driving distance from Kandla (IXY) to Yibin (YBP) is 3116 miles / 5015 kilometers, and travel time by car is about 60 hours 13 minutes.

Kandla Airport – Yibin Wuliangye Airport

Distance: 2170 miles / 3493 kilometers / 1886 nautical miles
Flight time: 4 h 36 min
Time difference: 2 h 30 min
CO2 emission: 237 kg


Distance from Kandla to Yibin

There are several ways to calculate the distance from Kandla to Yibin. Here are two standard methods:

Vincenty's formula (applied above)
  • 2170.261 miles
  • 3492.696 kilometers
  • 1885.905 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
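As an illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates (in decimal degrees) come from the airport information below; this is a standard textbook implementation, not the calculator's own code:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns distance in miles."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # iterate on the longitude difference on the auxiliary sphere
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    meters = b * A * (sigma - deltaSigma)
    return meters / 1609.344  # international mile

# Kandla (IXY) and Yibin (YBP), decimal degrees from the airport information below
print(round(vincenty_miles(23.1125, 70.100278, 28.857778, 104.525), 3))  # ≈ 2170.26
```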

Haversine formula
  • 2166.834 miles
  • 3487.182 kilometers
  • 1882.927 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
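A compact Python version of the haversine formula, using the same airport coordinates (the mean earth radius of 3,958.8 miles is an assumed value, so the last decimal may differ slightly from the figure above):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two points on a spherical earth."""
    R = 3958.8  # assumed mean earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Kandla (IXY) and Yibin (YBP), decimal degrees from the airport information below
print(round(haversine_miles(23.1125, 70.100278, 28.857778, 104.525), 1))  # ≈ 2166.8
```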

How long does it take to fly from Kandla to Yibin?

The estimated flight time from Kandla Airport to Yibin Wuliangye Airport is 4 hours and 36 minutes.
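One simple way to approximate such a figure is a fixed takeoff-and-landing allowance plus cruise time at a typical jet speed. The parameters below (500 mph cruise, 30 minutes overhead) are illustrative assumptions, not the calculator's published model, so the result is close to but not exactly the 4 h 36 min shown above:

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time model: fixed takeoff/landing allowance plus cruise time."""
    return overhead_min + distance_miles / cruise_mph * 60

total = flight_time_minutes(2170.261)
print(f"{int(total // 60)} h {round(total % 60)} min")  # ≈ 4 h 50 min with these assumed parameters
```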

Flight carbon footprint between Kandla Airport (IXY) and Yibin Wuliangye Airport (YBP)

On average, flying from Kandla to Yibin generates about 237 kg of CO2 per passenger, which is equivalent to 522 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
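A back-of-the-envelope version of this estimate: the per-passenger-mile emission factor used below is an assumption reverse-derived from the figures above, not a published constant:

```python
KG_PER_PASSENGER_MILE = 0.109  # assumed emission factor, back-derived from 237 kg / 2170 miles
KG_TO_LB = 2.20462             # kilograms to pounds

co2_kg = 2170.261 * KG_PER_PASSENGER_MILE
print(round(co2_kg))              # ≈ 237 kg
print(round(co2_kg * KG_TO_LB))   # ≈ 522 lbs
```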

Map of flight path and driving directions from Kandla to Yibin

See the map of the shortest flight path between Kandla Airport (IXY) and Yibin Wuliangye Airport (YBP).

Airport information

Origin Kandla Airport
City: Kandla
Country: India
IATA Code: IXY
ICAO Code: VAKE
Coordinates: 23°6′45″N, 70°6′1″E
Destination Yibin Wuliangye Airport
City: Yibin
Country: China
IATA Code: YBP
ICAO Code: ZUYB
Coordinates: 28°51′28″N, 104°31′30″E
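The coordinates above use degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion helper (illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(23, 6, 45, "N"))    # 23.1125  (Kandla latitude)
print(dms_to_decimal(104, 31, 30, "E"))  # 104.525  (Yibin longitude)
```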