
How far is Nakashibetsu from Yibin?

The distance between Yibin (Yibin Wuliangye Airport) and Nakashibetsu (Nakashibetsu Airport) is 2446 miles / 3937 kilometers / 2126 nautical miles.

The driving distance from Yibin (YBP) to Nakashibetsu (SHB) is 3855 miles / 6204 kilometers, and travel time by car is about 77 hours 5 minutes.

Distance from Yibin to Nakashibetsu

There are several ways to calculate the distance from Yibin to Nakashibetsu. Here are two standard methods:

Vincenty's formula (applied above)
  • 2446.202 miles
  • 3936.780 kilometers
  • 2125.691 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
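
The site doesn't publish its implementation, so here is a minimal, self-contained sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The helper name, convergence tolerance, and iteration cap are choices of this sketch, not the source's code; the coordinates are the airport positions listed below, converted to decimal degrees.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: geodesic distance in meters on WGS-84."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# YBP (28°51′28″N, 104°31′30″E) to SHB (43°34′38″N, 144°57′36″E)
meters = vincenty_distance(28.857778, 104.525, 43.577222, 144.96)
print(f"{meters / 1000:.1f} km")  # ≈ 3936.8 km, matching the figure above
```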

Haversine formula
  • 2442.468 miles
  • 3930.771 kilometers
  • 2122.446 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface of the sphere.
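
A matching sketch of the haversine formula, assuming the conventional mean Earth radius of 6371 km (which reproduces the figure above):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# YBP to SHB, same coordinates as above
print(f"{haversine_distance(28.857778, 104.525, 43.577222, 144.96):.1f} km")
# ≈ 3930.8 km
```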

How long does it take to fly from Yibin to Nakashibetsu?

The estimated flight time from Yibin Wuliangye Airport to Nakashibetsu Airport is 5 hours and 7 minutes.
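
The source doesn't say how the estimate is derived. One common model, sketched below, adds a fixed allowance for taxi, climb, and descent to time spent at an assumed average speed; the 30-minute overhead and 530 mph figure are assumptions chosen here because they reproduce the estimate above, not values confirmed by the source.

```python
def estimated_flight_time(distance_miles, cruise_mph=530, overhead_min=30):
    # Hypothetical model: fixed ground/climb allowance plus cruise time.
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimated_flight_time(2446))  # -> 5 hours and 7 minutes
```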

Flight carbon footprint between Yibin Wuliangye Airport (YBP) and Nakashibetsu Airport (SHB)

On average, flying from Yibin to Nakashibetsu generates about 269 kg (593 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
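
The per-passenger figure is consistent with a flat emission factor of roughly 0.11 kg of CO2 per passenger-mile (269 kg / 2446 miles); the factor below is that implied ratio, not a published value from the source.

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def co2_estimate_kg(distance_miles, kg_per_passenger_mile=0.11):
    # Hypothetical flat emission factor implied by the figures above.
    return distance_miles * kg_per_passenger_mile

kg = co2_estimate_kg(2446)
print(f"{kg:.0f} kg ≈ {kg / KG_PER_LB:.0f} lb")  # ≈ 269 kg ≈ 593 lb
```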

Map of flight path and driving directions from Yibin to Nakashibetsu

See the map of the shortest flight path between Yibin Wuliangye Airport (YBP) and Nakashibetsu Airport (SHB).

Airport information

Origin: Yibin Wuliangye Airport
City: Yibin
Country: China
IATA Code: YBP
ICAO Code: ZUYB
Coordinates: 28°51′28″N, 104°31′30″E

Destination: Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E
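
The coordinates above are given in degrees, minutes, and seconds. To use them with the distance formulas earlier on this page, convert them to decimal degrees; a small helper (hypothetical, not from the source):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# YBP: 28°51′28″N, 104°31′30″E
print(dms_to_decimal(28, 51, 28, "N"), dms_to_decimal(104, 31, 30, "E"))
# -> 28.857777..., 104.525
```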