How far is Jinzhou from Nakashibetsu?

The distance between Nakashibetsu (Nakashibetsu Airport) and Jinzhou (Jinzhou Bay Airport) is 1231 miles / 1982 kilometers / 1070 nautical miles.

The driving distance from Nakashibetsu (SHB) to Jinzhou (JNZ) is 2380 miles / 3830 kilometers, and travel time by car is about 50 hours 32 minutes.

Distance from Nakashibetsu to Jinzhou

There are several ways to calculate the distance from Nakashibetsu to Jinzhou. Here are two standard methods:

Vincenty's formula (applied above)
  • 1231.261 miles
  • 1981.523 kilometers
  • 1069.937 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
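
For readers who want to reproduce the figures, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid (semi-major axis a = 6378137 m, flattening f = 1/298.257223563). The function name, iteration cap, and convergence tolerance are illustrative choices, not the calculator's exact implementation:

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2,
                         a=6378137.0, f=1 / 298.257223563, tol=1e-12):
        """Vincenty inverse problem: geodesic distance in metres on WGS-84."""
        b = (1 - f) * a
        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)
        lam = L
        for _ in range(200):  # iterate lambda until it converges
            sinLam, cosLam = math.sin(lam), math.cos(lam)
            sinSigma = math.hypot(cosU2 * sinLam,
                                  cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
            if sinSigma == 0:
                return 0.0  # coincident points
            cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
            sigma = math.atan2(sinSigma, cosSigma)
            sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
            cos2Alpha = 1 - sinAlpha ** 2
            cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                          if cos2Alpha else 0.0)  # equatorial geodesic
            C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sinAlpha * (
                sigma + C * sinSigma * (
                    cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
            if abs(lam - lam_prev) < tol:
                break
        u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
            * (-3 + 4 * cos2SigmaM ** 2)))
        return b * A * (sigma - dSigma)

    # SHB (43.577222 N, 144.96 E) to JNZ (41.101389 N, 121.061944 E)
    metres = vincenty_inverse(43.577222, 144.96, 41.101389, 121.061944)
    print(f"{metres / 1000:.1f} km")  # ≈ 1981.5 km, matching the figure above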

Haversine formula
  • 1228.104 miles
  • 1976.442 kilometers
  • 1067.194 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
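
A compact sketch of the haversine formula in the same vein, with the coordinates taken from the airport information section below; the mean Earth radius of 6371 km is a common convention, and the calculator's exact radius isn't stated:

    import math

    def dms(d, m, s):
        """Convert degrees/minutes/seconds to decimal degrees."""
        return d + m / 60 + s / 3600

    def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
        """Great-circle distance on a sphere of mean radius r_km."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * r_km * math.asin(math.sqrt(h))

    # Coordinates from the airport information section below
    shb = (dms(43, 34, 38), dms(144, 57, 36))  # Nakashibetsu (SHB)
    jnz = (dms(41, 6, 5), dms(121, 3, 43))     # Jinzhou Bay (JNZ)
    print(f"{haversine_km(*shb, *jnz):.0f} km")  # ≈ 1976 km, matching the figure above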

How long does it take to fly from Nakashibetsu to Jinzhou?

The estimated flight time from Nakashibetsu Airport to Jinzhou Bay Airport is 2 hours and 49 minutes.
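
The calculator doesn't publish its timing model. A common rule of thumb, assumed here purely for illustration, is an average cruise speed of about 500 mph plus a 30-minute allowance for climb and descent; with those assumptions the estimate lands near, though not exactly on, the figure above:

    def estimated_flight_time(miles, cruise_mph=500.0, buffer_h=0.5):
        """Rough airtime: distance at an assumed cruise speed plus a fixed buffer."""
        hours = miles / cruise_mph + buffer_h
        return int(hours), round((hours - int(hours)) * 60)

    h, m = estimated_flight_time(1231)  # great-circle distance from above
    print(f"{h} h {m} min")  # ≈ 2 h 58 min under these assumed parameters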

Flight carbon footprint between Nakashibetsu Airport (SHB) and Jinzhou Bay Airport (JNZ)

On average, flying from Nakashibetsu to Jinzhou generates about 163 kg of CO2 per passenger, equivalent to 359 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
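
The unit conversion, and a per-mile intensity derived from the quoted figures, can be checked in a couple of lines; the per-mile number is a derived illustration, not a figure from this page:

    KG_PER_LB = 0.45359237  # definition of the avoirdupois pound in kilograms

    co2_kg = 163   # per-passenger estimate quoted above
    miles = 1231   # great-circle distance quoted above

    print(f"{co2_kg / KG_PER_LB:.0f} lb")                     # ≈ 359 lb
    print(f"{co2_kg / miles:.3f} kg CO2 per passenger-mile")  # ≈ 0.132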

Airport information

Origin: Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E

Destination: Jinzhou Bay Airport
City: Jinzhou
Country: China
IATA Code: JNZ
ICAO Code: ZYJZ
Coordinates: 41°6′5″N, 121°3′43″E