
How far is Anshan from Nakashibetsu?

The distance between Nakashibetsu (Nakashibetsu Airport) and Anshan (Anshan Teng'ao Airport) is 1141 miles / 1837 kilometers / 992 nautical miles.

The driving distance from Nakashibetsu (SHB) to Anshan (AOG) is 2293 miles / 3691 kilometers, and travel time by car is about 49 hours 4 minutes.

Nakashibetsu Airport – Anshan Teng'ao Airport

1141 miles / 1837 kilometers / 992 nautical miles


Distance from Nakashibetsu to Anshan

There are several ways to calculate the distance from Nakashibetsu to Anshan. Here are two standard methods:

Vincenty's formula (applied above)
  • 1141.250 miles
  • 1836.664 kilometers
  • 991.719 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
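For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name and the decimal coordinates (converted from the airport coordinates listed at the bottom of this page) are our own, and it should land very close to the 1141.250-mile figure above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda to convergence (may fail near antipodes)
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
            C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # ellipsoidal distance in metres

# SHB 43°34′38″N, 144°57′36″E  ->  (43.5772, 144.9600)
# AOG 41°6′19″N, 122°51′14″E   ->  (41.1053, 122.8539)
metres = vincenty_distance(43.5772, 144.9600, 41.1053, 122.8539)
print(metres / 1609.344)  # ≈ 1141 miles
```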

Haversine formula
  • 1138.337 miles
  • 1831.975 kilometers
  • 989.188 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
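A corresponding haversine sketch (same hypothetical decimal coordinates as above) shows why the spherical result comes out a few miles shorter: it assumes a single mean Earth radius, here 6371 km, rather than the ellipsoid.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_distance(43.5772, 144.9600, 41.1053, 122.8539)
print(km)             # ≈ 1832 km
print(km / 1.609344)  # ≈ 1138 miles
```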

How long does it take to fly from Nakashibetsu to Anshan?

The estimated flight time from Nakashibetsu Airport to Anshan Teng'ao Airport is 2 hours and 39 minutes.
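As a rough illustration of how such estimates are typically built, one common rule of thumb adds a fixed allowance for takeoff, climb, and landing to the cruise time at a typical jetliner speed. The constants below (500 mph cruise, 30 minutes of overhead) are assumptions for illustration, not the exact parameters behind the 2 hours 39 minutes figure above.

```python
def estimate_flight_time(miles, cruise_mph=500.0, overhead_min=30.0):
    """Rule-of-thumb flight time: cruise time plus a fixed overhead (assumed values)."""
    total_min = overhead_min + miles / cruise_mph * 60.0
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(1141.25))  # ≈ 2 hours 47 minutes with these assumed constants
```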

Flight carbon footprint between Nakashibetsu Airport (SHB) and Anshan Teng'ao Airport (AOG)

On average, flying from Nakashibetsu to Anshan generates about 159 kg of CO2 per passenger, which is equivalent to about 350 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
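The unit conversion and the per-mile intensity implied by those numbers are easy to check. The short sketch below derives both; the emission-factor calculation is inferred from the figures above, not from the calculator's actual methodology.

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

co2_kg = 159.0
distance_miles = 1141.25

print(co2_kg / KG_PER_LB)       # ≈ 350.5 lbs, matching the rounded figure above
print(co2_kg / distance_miles)  # ≈ 0.139 kg CO2 per passenger-mile (implied intensity)
```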

Map of flight path and driving directions from Nakashibetsu to Anshan

See the map of the shortest flight path between Nakashibetsu Airport (SHB) and Anshan Teng'ao Airport (AOG).

Airport information

Origin: Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E
Destination: Anshan Teng'ao Airport
City: Anshan
Country: China
IATA Code: AOG
ICAO Code: ZYAS
Coordinates: 41°6′19″N, 122°51′14″E