
How far is Nakashibetsu from Burqin?

The distance between Burqin (Burqin Kanas Airport) and Nakashibetsu (Nakashibetsu Airport) is 2746 miles / 4420 kilometers / 2387 nautical miles.

The driving distance from Burqin (KJI) to Nakashibetsu (SHB) is 4729 miles / 7610 kilometers, and travel time by car is about 93 hours 17 minutes.

Burqin Kanas Airport – Nakashibetsu Airport

2746 miles / 4420 kilometers / 2387 nautical miles


Distance from Burqin to Nakashibetsu

There are several ways to calculate the distance from Burqin to Nakashibetsu. Here are two standard methods:

Vincenty's formula (applied above)
  • 2746.425 miles
  • 4419.942 kilometers
  • 2386.578 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
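The iterative inverse method behind that figure can be sketched in Python. This is an illustrative implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, not the calculator's own code; the airport coordinates are taken from the table below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# KJI (48°13′20″N, 86°59′45″E) to SHB (43°34′38″N, 144°57′36″E)
kji = (48 + 13/60 + 20/3600, 86 + 59/60 + 45/3600)
shb = (43 + 34/60 + 38/3600, 144 + 57/60 + 36/3600)
print(round(vincenty_distance(*kji, *shb) / 1000, 1))  # ≈ 4419.9 km
```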

Haversine formula
  • 2738.753 miles
  • 4407.596 kilometers
  • 2379.912 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path along the surface between two points.
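The haversine calculation is compact enough to show in full. This is a sketch assuming a mean Earth radius of 6371 km; the airport coordinates are taken from the table below.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# KJI (48°13′20″N, 86°59′45″E) to SHB (43°34′38″N, 144°57′36″E)
kji = (48 + 13/60 + 20/3600, 86 + 59/60 + 45/3600)
shb = (43 + 34/60 + 38/3600, 144 + 57/60 + 36/3600)
print(round(haversine_distance(*kji, *shb), 1))  # ≈ 4407.6 km
```

The small gap between this result and the Vincenty figure (about 12 km over 4400 km) comes from the spherical approximation.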

How long does it take to fly from Burqin to Nakashibetsu?

The estimated flight time from Burqin Kanas Airport to Nakashibetsu Airport is 5 hours and 41 minutes.
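A common way to approximate flight time is cruise time plus a fixed buffer for taxi, climb, and descent. The 500 mph cruise speed and 30-minute buffer below are illustrative assumptions, not the calculator's actual parameters, so the result differs slightly from the figure above.

```python
# Rough flight-time estimate: distance / cruise speed + fixed buffer.
# Both parameters are assumptions for illustration.
CRUISE_MPH = 500
BUFFER_HOURS = 0.5

def flight_time_hours(distance_miles):
    return distance_miles / CRUISE_MPH + BUFFER_HOURS

total_min = round(flight_time_hours(2746) * 60)
h, m = divmod(total_min, 60)
print(f"{h} h {m} min")  # ≈ 6 h 0 min with these assumptions
```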

Flight carbon footprint between Burqin Kanas Airport (KJI) and Nakashibetsu Airport (SHB)

On average, flying from Burqin to Nakashibetsu generates about 304 kg of CO2 per passenger, which is equivalent to 670 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
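The 304 kg figure itself comes from the calculator's model; the kilogram-to-pound conversion can be checked directly:

```python
# Unit check for the footprint figure: kilograms to pounds.
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 304
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # → 670
```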

Map of flight path and driving directions from Burqin to Nakashibetsu

See the map of the shortest flight path between Burqin Kanas Airport (KJI) and Nakashibetsu Airport (SHB).

Airport information

Origin Burqin Kanas Airport
City: Burqin
Country: China
IATA Code: KJI
ICAO Code: ZWKN
Coordinates: 48°13′20″N, 86°59′45″E
Destination Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E