How far is Nakashibetsu from Jining?

The distance between Jining (Jining Qufu Airport) and Nakashibetsu (Nakashibetsu Airport) is 1624 miles / 2614 kilometers / 1411 nautical miles.

The driving distance from Jining (JNG) to Nakashibetsu (SHB) is 2580 miles / 4152 kilometers, and travel time by car is about 63 hours 54 minutes.

Jining Qufu Airport – Nakashibetsu Airport

1624 miles / 2614 kilometers / 1411 nautical miles

Distance from Jining to Nakashibetsu

There are several ways to calculate the distance from Jining to Nakashibetsu. Here are two standard methods:

Vincenty's formula (applied above)
  • 1624.009 miles
  • 2613.590 kilometers
  • 1411.226 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
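
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula. The WGS-84 ellipsoid is an assumption (the page does not say which reference ellipsoid it uses), and the coordinates come from the airport information section further down.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula: geodesic distance in metres on WGS-84."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):   # iteration may fail for near-antipodal points
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0          # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (sigma + C * sinSigma * (
            cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)

# JNG 35°17′34″N 116°20′48″E  ->  SHB 43°34′38″N 144°57′36″E
m = vincenty_distance(35 + 17/60 + 34/3600, 116 + 20/60 + 48/3600,
                      43 + 34/60 + 38/3600, 144 + 57/60 + 36/3600)
print(f"{m / 1609.344:.3f} mi / {m / 1000:.3f} km")  # ≈ 1624.009 mi / 2613.590 km
```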

Haversine formula
  • 1620.815 miles
  • 2608.449 kilometers
  • 1408.450 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
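
A matching Python sketch of the haversine formula, assuming the conventional mean Earth radius of 6371 km:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

d = haversine_distance(35 + 17/60 + 34/3600, 116 + 20/60 + 48/3600,
                       43 + 34/60 + 38/3600, 144 + 57/60 + 36/3600)
print(f"{d:.1f} km")  # ≈ 2608.4 km, in line with the figure above
```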

How long does it take to fly from Jining to Nakashibetsu?

The estimated flight time from Jining Qufu Airport to Nakashibetsu Airport is 3 hours and 34 minutes.
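
The page does not state how this estimate is derived. A common rule of thumb, sketched below, assumes a cruise speed of roughly 500 mph plus a fixed allowance for taxi, climb, and descent; the constants are illustrative assumptions and land near, but not exactly on, the quoted figure.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # cruise_mph and overhead_min are illustrative assumptions, not the
    # site's actual constants.
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimated_flight_time(1624))  # "3 hours 45 minutes" with these assumptions
```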

Flight carbon footprint between Jining Qufu Airport (JNG) and Nakashibetsu Airport (SHB)

On average, flying from Jining to Nakashibetsu generates about 187 kg of CO2 per passenger; 187 kilograms equals 412 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
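
The page does not publish its emission model. A minimal sketch of the usual distance-times-factor approach follows; the per-mile factor is an assumption back-solved from the 187 kg figure, and real estimators vary it by aircraft type, cabin class, load factor, and route length.

```python
def flight_co2(distance_miles, kg_per_passenger_mile=0.115):
    # kg_per_passenger_mile is an assumed emission factor, chosen to
    # reproduce the 187 kg figure quoted above.
    kg = distance_miles * kg_per_passenger_mile
    return kg, kg * 2.20462  # (kilograms, pounds)

kg, lbs = flight_co2(1624)
print(f"{kg:.0f} kg CO2 ≈ {lbs:.0f} lbs")  # 187 kg CO2 ≈ 412 lbs
```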

Map of flight path and driving directions from Jining to Nakashibetsu

See the map of the shortest flight path between Jining Qufu Airport (JNG) and Nakashibetsu Airport (SHB).

Airport information

Origin: Jining Qufu Airport
City: Jining
Country: China
IATA Code: JNG
ICAO Code: ZSJG
Coordinates: 35°17′34″N, 116°20′48″E
Destination: Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E
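
The coordinates above are listed in degrees, minutes, and seconds. A small helper, sketched here, converts them to the signed decimal degrees used in the distance examples earlier (hemisphere handling assumes single-letter N/S/E/W suffixes):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Convert degrees/minutes/seconds (e.g. 35°17′34″N) to signed decimal degrees.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(35, 17, 34, "N"))   # ≈ 35.29278  (JNG latitude)
print(dms_to_decimal(144, 57, 36, "E"))  # 144.96      (SHB longitude)
```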