
How far is Nakashibetsu from Hefei?

The distance between Hefei (Hefei Luogang Airport) and Nakashibetsu (Nakashibetsu Airport) is 1709 miles / 2751 kilometers / 1485 nautical miles.

The driving distance from Hefei (HFE) to Nakashibetsu (SHB) is 3202 miles / 5153 kilometers, and travel time by car is about 65 hours 25 minutes.

Hefei Luogang Airport – Nakashibetsu Airport: 1709 miles (2751 kilometers, 1485 nautical miles)

Distance from Hefei to Nakashibetsu

There are several ways to calculate the distance from Hefei to Nakashibetsu. Here are two standard methods:

Vincenty's formula (applied above)
  • 1709.209 miles
  • 2750.705 kilometers
  • 1485.262 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
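To make the ellipsoidal method concrete, here is a minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, written from the standard published form of the algorithm (the site's exact implementation and rounding are not shown, so treat this as an illustration rather than the site's code). The coordinates are the decimal equivalents of the DMS values listed in the airport information below.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0                     # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563             # WGS-84 flattening
    b = (1 - f) * a                   # semi-minor axis (meters)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):              # iterate until the longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                               * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344          # meters to statute miles

# HFE (31°46′48″N, 117°17′52″E) to SHB (43°34′38″N, 144°57′36″E)
print(round(vincenty_miles(31.78, 117.297778, 43.577222, 144.96), 1))
```

Run against the coordinates above, this reproduces the figure in the table to within a fraction of a mile.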

Haversine formula
  • 1706.801 miles
  • 2746.830 kilometers
  • 1483.169 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (the great-circle distance, which is the shortest path between two points along the surface).
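The spherical calculation is short enough to sketch in full. This assumes a mean Earth radius of 3958.8 statute miles; the site's exact radius constant is not stated, so small differences in the last decimal place are expected.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of mean Earth radius, in miles."""
    R = 3958.8  # mean Earth radius in statute miles (assumed constant)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# HFE and SHB coordinates converted from the DMS values listed below
print(round(haversine_miles(31.78, 117.297778, 43.577222, 144.96), 1))
```

The result lands within a couple of miles of the 1706.801-mile figure above; the residual gap comes from the choice of radius constant.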

How long does it take to fly from Hefei to Nakashibetsu?

The estimated flight time from Hefei Luogang Airport to Nakashibetsu Airport is 3 hours and 44 minutes.
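As a sanity check on that figure, dividing the straight-line distance by the stated block time gives the implied average speed. This is only a back-of-envelope reading of the numbers above, not the site's actual estimation method.

```python
# Implied average speed from the figures above (illustrative check only)
distance_miles = 1709
block_hours = 3 + 44 / 60        # stated flight time: 3 h 44 min
speed_mph = distance_miles / block_hours
print(round(speed_mph))          # → 458
```

An average of roughly 458 mph over the whole flight, including climb and descent, is a plausible block speed for a commercial jet on a route of this length.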

Flight carbon footprint between Hefei Luogang Airport (HFE) and Nakashibetsu Airport (SHB)

On average, flying from Hefei to Nakashibetsu generates about 193 kg of CO2 per passenger, which is about 426 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
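The unit conversion behind the pounds figure is straightforward. Note that converting the rounded 193 kg gives about 425 lbs; the site's 426 presumably comes from converting an unrounded kilogram figure.

```python
KG_PER_LB = 0.45359237           # exact definition of the avoirdupois pound
co2_kg = 193
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))             # → 425 (site shows 426, rounded upstream)
```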

Map of flight path and driving directions from Hefei to Nakashibetsu

See the map of the shortest flight path between Hefei Luogang Airport (HFE) and Nakashibetsu Airport (SHB).

Airport information

Origin Hefei Luogang Airport
City: Hefei
Country: China
IATA Code: HFE
ICAO Code: ZSOF
Coordinates: 31°46′48″N, 117°17′52″E
Destination Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E
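The coordinates above are given in degrees, minutes, and seconds; the distance formulas need decimal degrees. The conversion is a sketch of standard practice (north and east are positive; south and west would be negated):

```python
def dms_to_decimal(deg, minutes, seconds):
    """Convert degrees/minutes/seconds to decimal degrees (N/E positive)."""
    return deg + minutes / 60 + seconds / 3600

# Hefei Luogang (HFE): 31°46′48″N, 117°17′52″E
print(dms_to_decimal(31, 46, 48), dms_to_decimal(117, 17, 52))
# Nakashibetsu (SHB): 43°34′38″N, 144°57′36″E
print(dms_to_decimal(43, 34, 38), dms_to_decimal(144, 57, 36))
```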