
How far is Nakashibetsu from Zunyi?

The distance between Zunyi (Zunyi Xinzhou Airport) and Nakashibetsu (Nakashibetsu Airport) is 2377 miles / 3826 kilometers / 2066 nautical miles.

The driving distance from Zunyi (ZYI) to Nakashibetsu (SHB) is 3839 miles / 6178 kilometers, and travel time by car is about 76 hours 57 minutes.


Distance from Zunyi to Nakashibetsu

There are several ways to calculate the distance from Zunyi to Nakashibetsu. Here are two standard methods:

Vincenty's formula (applied above)
  • 2377.247 miles
  • 3825.808 kilometers
  • 2065.771 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
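One way to reproduce the ellipsoidal figure above is geopy's `geodesic` distance, which solves the same inverse problem on the WGS-84 ellipsoid (via Karney's algorithm, a modern successor to Vincenty's iteration, so the result may differ from Vincenty's by a tiny amount). A minimal sketch, using decimal-degree coordinates converted from the airport table below:

```python
from geopy.distance import geodesic  # ellipsoidal (WGS-84) distance

# Decimal-degree coordinates, converted from the airport table below
zyi = (27.5894, 107.0006)   # Zunyi Xinzhou Airport
shb = (43.5772, 144.9600)   # Nakashibetsu Airport

d = geodesic(zyi, shb)
print(round(d.miles, 1), "mi,", round(d.km, 1), "km,", round(d.nautical, 1), "NM")
# Expect values close to 2377 mi / 3826 km / 2066 NM
```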

Haversine formula
  • 2374.180 miles
  • 3820.873 kilometers
  • 2063.106 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
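The haversine formula itself is short enough to implement directly. The sketch below assumes a mean Earth radius of 3,958.8 miles; with that radius the result lands close to the figure above.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_miles * asin(sqrt(a))

# Coordinates from the airport table below, in decimal degrees
print(round(haversine_miles(27.5894, 107.0006, 43.5772, 144.9600), 1))
# ≈ 2374 miles, matching the haversine figure above
```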

How long does it take to fly from Zunyi to Nakashibetsu?

The estimated flight time from Zunyi Xinzhou Airport to Nakashibetsu Airport is 5 hours and 0 minutes.
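The page does not state its timing model, but a simple rule of thumb reproduces the figure: divide the air distance by an assumed average block speed. The 475 mph value below is an illustrative assumption, not the calculator's actual parameter.

```python
# Rough flight-time estimate: distance / assumed average block speed
distance_miles = 2377   # Vincenty distance from above
avg_speed_mph = 475     # assumed average block speed (not from the source)

hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} hours {m} minutes")  # ≈ 5 hours 0 minutes
```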

Flight carbon footprint between Zunyi Xinzhou Airport (ZYI) and Nakashibetsu Airport (SHB)

On average, flying from Zunyi to Nakashibetsu generates about 261 kg of CO2 per passenger, which is roughly 575 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
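As a quick unit check on the conversion quoted above (1 kg ≈ 2.20462 lb):

```python
co2_kg = 261
print(round(co2_kg * 2.20462))  # 575 lb, matching the figure above
```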

Map of flight path and driving directions from Zunyi to Nakashibetsu

See the map of the shortest flight path between Zunyi Xinzhou Airport (ZYI) and Nakashibetsu Airport (SHB).

Airport information

Origin: Zunyi Xinzhou Airport
City: Zunyi
Country: China
IATA Code: ZYI
ICAO Code: ZUZY
Coordinates: 27°35′22″N, 107°0′2″E
Destination: Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E
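The distance examples earlier use decimal degrees, while this table lists degrees, minutes, and seconds. A small helper (hypothetical, for illustration) converts between the two:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Zunyi Xinzhou: 27°35′22″N, 107°0′2″E
print(round(dms_to_decimal(27, 35, 22, "N"), 4))   # 27.5894
print(round(dms_to_decimal(107, 0, 2, "E"), 4))    # 107.0006

# Nakashibetsu: 43°34′38″N, 144°57′36″E
print(round(dms_to_decimal(43, 34, 38, "N"), 4))   # 43.5772
print(round(dms_to_decimal(144, 57, 36, "E"), 4))  # 144.96
```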