How far is Yibin from Manila?

The distance between Manila (Ninoy Aquino International Airport) and Yibin (Yibin Wuliangye Airport) is 1446 miles / 2327 kilometers / 1256 nautical miles.

Ninoy Aquino International Airport – Yibin Wuliangye Airport

Distance from Manila to Yibin

There are several ways to calculate the distance from Manila to Yibin. Here are two standard methods:

Vincenty's formula (applied above)
  • 1445.732 miles
  • 2326.681 kilometers
  • 1256.307 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
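The page does not publish its implementation. As a rough sketch (not the site's own code), an ellipsoidal WGS-84 distance very close to the Vincenty figures above can be computed with the third-party geopy library, whose geodesic() function uses Karney's algorithm rather than Vincenty's iteration but works on the same ellipsoid and typically agrees with it to within a millimetre:

    # Ellipsoidal (WGS-84) distance sketch using the third-party geopy library.
    # Note: geodesic() implements Karney's algorithm, not Vincenty's iteration,
    # but both assume an ellipsoidal earth model.
    from geopy.distance import geodesic

    MNL = (14.508333, 121.019722)   # Ninoy Aquino International Airport, decimal degrees
    YBP = (28.857778, 104.525000)   # Yibin Wuliangye Airport, decimal degrees

    d = geodesic(MNL, YBP)
    print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
    # Expected to land very close to the figures above:
    # ~1445.7 mi / ~2326.7 km / ~1256.3 NM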

Haversine formula
  • 1447.348 miles
  • 2329.281 kilometers
  • 1257.711 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
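As a self-contained sketch (assuming a mean earth radius of 6,371 km, which the figures above appear to use), the haversine calculation can be written as:

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two points on a sphere of the given radius."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * radius_km * asin(sqrt(a))

    # MNL and YBP in decimal degrees (converted from the DMS coordinates listed below).
    km = haversine_km(14.508333, 121.019722, 28.857778, 104.525000)
    print(f"{km:.3f} km = {km / 1.609344:.3f} mi = {km / 1.852:.3f} NM")
    # Roughly 2329 km / 1447 mi / 1258 NM, in line with the haversine figures above.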

How long does it take to fly from Manila to Yibin?

The estimated flight time from Ninoy Aquino International Airport to Yibin Wuliangye Airport is 3 hours and 14 minutes.
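The page does not state the assumptions behind this estimate. Flight-time calculators typically add a fixed allowance for taxi, climb, and descent to the cruise time at an assumed average speed; the sketch below uses hypothetical values (500 mph cruise, 30 minutes overhead) and therefore does not reproduce the 3 h 14 min figure exactly:

    def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Rough block-time estimate: cruise time at an assumed average speed plus
        a fixed allowance for taxi, climb, and descent. Both defaults are
        assumptions, not the values this page uses."""
        total_min = overhead_min + distance_miles / cruise_mph * 60
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} h {minutes} min"

    print(estimated_flight_time(1446))   # '3 h 24 min' with these assumed parameters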

What is the time difference between Manila and Yibin?

There is no time difference between Manila and Yibin; both cities observe UTC+8.
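Both airports keep UTC+8 year-round (Philippine Time and China Standard Time), which can be confirmed with Python's standard-library zoneinfo module:

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    now = datetime.now(timezone.utc)
    manila_offset = now.astimezone(ZoneInfo("Asia/Manila")).utcoffset()
    yibin_offset = now.astimezone(ZoneInfo("Asia/Shanghai")).utcoffset()  # Yibin follows China Standard Time

    print(manila_offset, yibin_offset, manila_offset == yibin_offset)
    # 8:00:00 8:00:00 True -> no time difference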

Flight carbon footprint between Ninoy Aquino International Airport (MNL) and Yibin Wuliangye Airport (YBP)

On average, flying from Manila to Yibin generates about 176 kg (388 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
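The kilogram-to-pound conversion, and the per-kilometre emission rate implied by these numbers, can be checked with a couple of lines (the site's actual emissions model is not published, so the implied rate is only descriptive):

    KG_PER_LB = 0.45359237

    co2_kg = 176
    distance_km = 2327

    print(f"{co2_kg / KG_PER_LB:.0f} lb")           # ~388 lb, matching the figure above
    print(f"{co2_kg / distance_km:.3f} kg CO2/km")  # ~0.076 kg per passenger-km (implied rate)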

Map of flight path from Manila to Yibin

See the map of the shortest flight path between Ninoy Aquino International Airport (MNL) and Yibin Wuliangye Airport (YBP).

Airport information

Origin: Ninoy Aquino International Airport
City: Manila
Country: Philippines
IATA Code: MNL
ICAO Code: RPLL
Coordinates: 14°30′30″N, 121°1′11″E

Destination: Yibin Wuliangye Airport
City: Yibin
Country: China
IATA Code: YBP
ICAO Code: ZUYB
Coordinates: 28°51′28″N, 104°31′30″E
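The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page need decimal degrees. A small conversion sketch:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to signed decimal degrees."""
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    # MNL: 14°30′30″N, 121°1′11″E   ->  (14.508333, 121.019722)
    # YBP: 28°51′28″N, 104°31′30″E  ->  (28.857778, 104.525000)
    mnl = (dms_to_decimal(14, 30, 30, "N"), dms_to_decimal(121, 1, 11, "E"))
    ybp = (dms_to_decimal(28, 51, 28, "N"), dms_to_decimal(104, 31, 30, "E"))
    print(mnl, ybp)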