
How far is Nakashibetsu from Arxan?

The distance between Arxan (Arxan Yi'ershi Airport) and Nakashibetsu (Nakashibetsu Airport) is 1239 miles / 1994 kilometers / 1077 nautical miles.

The driving distance from Arxan (YIE) to Nakashibetsu (SHB) is 2813 miles / 4527 kilometers, and travel time by car is about 58 hours 37 minutes.

Arxan Yi'ershi Airport – Nakashibetsu Airport

1239 miles · 1994 kilometers · 1077 nautical miles

Distance from Arxan to Nakashibetsu

There are several ways to calculate the distance from Arxan to Nakashibetsu. Here are two standard methods:

Vincenty's formula (applied above)
  • 1238.949 miles
  • 1993.895 kilometers
  • 1076.617 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
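To reproduce the 1238.949-mile figure, here is a sketch of Vincenty's inverse method on the WGS-84 ellipsoid in pure Python. This is an independent implementation, not the calculator's own code; the iteration limit and convergence tolerance are assumptions.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse solution on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0                 # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):    # iterate the longitude on the auxiliary sphere
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344     # meters to statute miles

# Airport coordinates from the table below, converted from DMS to decimal degrees
yie = (47 + 18/60 + 38/3600, 119 + 54/60 + 42/3600)   # Arxan (YIE)
shb = (43 + 34/60 + 38/3600, 144 + 57/60 + 36/3600)   # Nakashibetsu (SHB)
print(vincenty_miles(*yie, *shb))  # within a fraction of a mile of 1238.949
```

Because it iterates on an ellipsoidal model, Vincenty's method converges to millimeter-level accuracy for nearly all airport pairs, at the cost of a loop the haversine formula avoids.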

Haversine formula
  • 1235.638 miles
  • 1988.567 kilometers
  • 1073.741 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
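The haversine computation fits in a few lines. The sketch below assumes a mean Earth radius of 3958.8 miles; using a slightly different radius shifts the result by a fraction of a mile.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_miles * math.asin(math.sqrt(h))

# Airport coordinates from the table below, converted from DMS to decimal degrees
yie = (47 + 18/60 + 38/3600, 119 + 54/60 + 42/3600)   # Arxan (YIE)
shb = (43 + 34/60 + 38/3600, 144 + 57/60 + 36/3600)   # Nakashibetsu (SHB)
print(haversine_miles(*yie, *shb))  # within a mile of the 1235.638 figure above
```

The spherical assumption is why this lands about 3 miles short of the Vincenty result: the Earth bulges at the equator, and the ellipsoidal model accounts for that.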

How long does it take to fly from Arxan to Nakashibetsu?

The estimated flight time from Arxan Yi'ershi Airport to Nakashibetsu Airport is 2 hours and 50 minutes.
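Flight-time estimates of this kind typically divide the distance by a cruise speed and add a fixed allowance for taxi, climb, and descent. The sketch below uses an assumed 500 mph cruise and 30-minute overhead, since the calculator's exact parameters are not published; its output will therefore differ slightly from the 2 h 50 min figure above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    """Rough flight time: cruise time plus a fixed taxi/climb/descent allowance.
    The 500 mph speed and 30-minute overhead are illustrative assumptions."""
    hours = distance_miles / cruise_mph + overhead_hours
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

h, m = estimate_flight_time(1238.9)
print(f"about {h} hours {m} minutes")
```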

Flight carbon footprint between Arxan Yi'ershi Airport (YIE) and Nakashibetsu Airport (SHB)

On average, flying from Arxan to Nakashibetsu generates about 163 kg of CO2 per passenger (163 kilograms equals 359 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
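The kilogram-to-pound conversion behind the 359 lbs figure uses the exact definition 1 lb = 0.45359237 kg:

```python
KG_PER_LB = 0.45359237  # exact definition of the international avoirdupois pound

def kg_to_lbs(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lbs(163)))  # → 359
```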

Map of flight path and driving directions from Arxan to Nakashibetsu

See the map of the shortest flight path between Arxan Yi'ershi Airport (YIE) and Nakashibetsu Airport (SHB).

Airport information

Origin: Arxan Yi'ershi Airport
City: Arxan
Country: China
IATA Code: YIE
ICAO Code: ZBES
Coordinates: 47°18′38″N, 119°54′42″E
Destination: Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E