
How far is Nakashibetsu from Fuyun?

The distance between Fuyun (Fuyun Koktokay Airport) and Nakashibetsu (Nakashibetsu Airport) is 2660 miles / 4281 kilometers / 2312 nautical miles.

The driving distance from Fuyun (FYN) to Nakashibetsu (SHB) is 4452 miles / 7165 kilometers, and travel time by car is about 97 hours 57 minutes.

Fuyun Koktokay Airport – Nakashibetsu Airport

2660 miles / 4281 kilometers / 2312 nautical miles


Distance from Fuyun to Nakashibetsu

There are several ways to calculate the distance from Fuyun to Nakashibetsu. Here are two standard methods:

Vincenty's formula (applied above)
  • 2660.039 miles
  • 4280.917 kilometers
  • 2311.510 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
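
For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal-degree coordinates are converted from the DMS values listed in the airport information section; the function name and structure are illustrative, not the calculator's actual code, and antipodal/coincident-point edge cases are omitted.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos2sigma_m ** 2)
            - B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344     # metres -> statute miles

# FYN 46°48′15″N 89°30′43″E and SHB 43°34′38″N 144°57′36″E in decimal degrees
print(round(vincenty_miles(46.8042, 89.5119, 43.5772, 144.96), 1))  # ≈ 2660 miles
```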

Haversine formula
  • 2652.659 miles
  • 4269.040 kilometers
  • 2305.097 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
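
A corresponding haversine sketch, assuming a mean Earth radius of 6371 km and the same decimal-degree coordinates as in the Vincenty example above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere of the given radius.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_km(46.8042, 89.5119, 43.5772, 144.96)
print(round(km, 1), round(km / 1.609344, 1))  # ≈ 4269 km / ≈ 2653 miles
```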

How long does it take to fly from Fuyun to Nakashibetsu?

The estimated flight time from Fuyun Koktokay Airport to Nakashibetsu Airport is 5 hours and 32 minutes.
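
The assumptions behind this figure are not stated on the page. A common rule of thumb estimates block time as distance divided by an average speed of roughly 500 mph plus about 30 minutes for taxi, climb, and descent; the sketch below uses those assumed values, so it lands close to, but not exactly on, the figure above.

```python
# Back-of-the-envelope flight-time estimate (assumed values, not the
# calculator's actual method).
distance_miles = 2660
cruise_mph = 500          # assumed average speed
overhead_hours = 0.5      # assumed taxi/climb/descent allowance
hours = distance_miles / cruise_mph + overhead_hours
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")   # ≈ 5 h 49 min with these assumptions
```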

Flight carbon footprint between Fuyun Koktokay Airport (FYN) and Nakashibetsu Airport (SHB)

On average, flying from Fuyun to Nakashibetsu generates about 294 kg of CO2 per passenger, which is equivalent to about 648 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
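
The methodology behind this estimate is not stated. One simple approach, shown below as an assumption rather than the calculator's actual method, multiplies the flight distance by a per-passenger-kilometre emission factor; a factor of about 0.069 kg CO2 per passenger-km roughly reproduces the figure above.

```python
# Hedged sketch of a per-passenger CO2 estimate (assumed emission factor).
distance_km = 4281
co2_per_pax_km = 0.069          # assumed factor, kg CO2 per passenger-km
co2_kg = distance_km * co2_per_pax_km
co2_lbs = co2_kg * 2.20462      # kilograms -> pounds
print(round(co2_kg), round(co2_lbs))  # ≈ 295 kg / ≈ 651 lbs
```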

Map of flight path and driving directions from Fuyun to Nakashibetsu

See the map of the shortest flight path between Fuyun Koktokay Airport (FYN) and Nakashibetsu Airport (SHB).

Airport information

Origin Fuyun Koktokay Airport
City: Fuyun
Country: China
IATA Code: FYN
ICAO Code: ZWFY
Coordinates: 46°48′15″N, 89°30′43″E
Destination Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E
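
To use these coordinates in the distance formulas above, they first need to be converted from degrees/minutes/seconds to decimal degrees. The helper below is a hypothetical convenience function, not something provided by the page.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Convert a DMS coordinate to signed decimal degrees (S and W are negative).
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(46, 48, 15, "N"), dms_to_decimal(89, 30, 43, "E"))   # FYN ≈ 46.8042, 89.5119
print(dms_to_decimal(43, 34, 38, "N"), dms_to_decimal(144, 57, 36, "E"))  # SHB ≈ 43.5772, 144.9600
```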