
How far is Baishan from Nakashibetsu?

The distance between Nakashibetsu (Nakashibetsu Airport) and Baishan (Changbaishan Airport) is 886 miles / 1427 kilometers / 770 nautical miles.

The driving distance from Nakashibetsu (SHB) to Baishan (NBS) is 2663 miles / 4286 kilometers, and travel time by car is about 65 hours 29 minutes.

Nakashibetsu Airport – Changbaishan Airport

886 miles / 1427 kilometers / 770 nautical miles


Distance from Nakashibetsu to Baishan

There are several ways to calculate the distance from Nakashibetsu to Baishan. Here are two standard methods:

Vincenty's formula (applied above)
  • 886.498 miles
  • 1426.681 kilometers
  • 770.346 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
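As a concrete illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The input coordinates are the airport coordinates listed under Airport information, converted to decimal degrees; the iteration tolerance and WGS-84 constants are standard choices, not values published by this site.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Ellipsoidal distance in metres between two points (Vincenty, WGS-84)."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate on the longitude difference on the auxiliary sphere
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0         # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (sigma + C * sinSigma *
              (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)

# SHB (43°34′38″N, 144°57′36″E) to NBS (42°4′0″N, 127°36′7″E) in decimal degrees
d_m = vincenty_inverse(43.577222, 144.960000, 42.066667, 127.601944)
print(d_m / 1609.344, d_m / 1000, d_m / 1852)  # ≈ 886.5 mi, 1426.7 km, 770.3 NM
```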

Haversine formula
  • 884.182 miles
  • 1422.953 kilometers
  • 768.333 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
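The haversine computation is short enough to show in full. This sketch assumes the commonly used mean Earth radius of 6371 km, which reproduces the spherical figure above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance on a sphere of radius r_km (mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(a))

print(haversine_km(43.577222, 144.960000, 42.066667, 127.601944))  # ≈ 1423 km
```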

How long does it take to fly from Nakashibetsu to Baishan?

The estimated flight time from Nakashibetsu Airport to Changbaishan Airport is 2 hours and 10 minutes.
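The exact model behind this estimate is not published. A common rule of thumb is cruise time at roughly 500 mph plus a fixed allowance for taxi, climb, and descent; the sketch below uses assumed parameters and lands close to, but not exactly on, the quoted figure:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: cruise leg plus a fixed taxi/climb/descent allowance."""
    # Both cruise_mph and overhead_min are assumptions, not this site's constants.
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(886.498))  # "2 hours 16 minutes" under these assumptions
```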

Flight carbon footprint between Nakashibetsu Airport (SHB) and Changbaishan Airport (NBS)

On average, flying from Nakashibetsu to Baishan generates about 142 kg of CO2 per passenger (roughly 314 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
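A simple linear model reproduces an estimate of this kind. The per-mile emission factor below is reverse-engineered from the figures above (142 kg / 886.5 mi ≈ 0.16 kg per mile) and is an assumption, not a constant published by this site:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def co2_per_passenger_kg(distance_miles, kg_per_mile=0.16):
    """Linear emissions estimate: distance times an assumed average emission factor."""
    return distance_miles * kg_per_mile

kg = co2_per_passenger_kg(886.498)
print(f"{kg:.0f} kg ≈ {kg / KG_PER_LB:.0f} lbs")  # ≈ 142 kg ≈ 313 lbs
```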

Map of flight path and driving directions from Nakashibetsu to Baishan

See the map of the shortest flight path between Nakashibetsu Airport (SHB) and Changbaishan Airport (NBS).

Airport information

Origin: Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E

Destination: Changbaishan Airport
City: Baishan
Country: China
IATA Code: NBS
ICAO Code: ZYBS
Coordinates: 42°4′0″N, 127°36′7″E
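For reference, a small sketch of the degrees/minutes/seconds conversion used to turn the coordinates above into the decimal degrees fed to the distance formulas (hemisphere handling follows the usual N/E positive convention):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SHB: 43°34′38″N, 144°57′36″E  →  (43.577222, 144.960000)
print(dms_to_decimal(43, 34, 38, "N"), dms_to_decimal(144, 57, 36, "E"))
# NBS: 42°4′0″N, 127°36′7″E     →  (42.066667, 127.601944)
print(dms_to_decimal(42, 4, 0, "N"), dms_to_decimal(127, 36, 7, "E"))
```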