
How far is Wakkanai from Bole?

The distance between Bole (Alashankou Bole (Bortala) Airport) and Wakkanai (Wakkanai Airport) is 2839 miles / 4569 kilometers / 2467 nautical miles.

The driving distance from Bole (BPL) to Wakkanai (WKJ) is 4653 miles / 7488 kilometers, and travel time by car is about 91 hours 1 minute.

Alashankou Bole (Bortala) Airport – Wakkanai Airport

2839 miles
4569 kilometers
2467 nautical miles


Distance from Bole to Wakkanai

There are several ways to calculate the distance from Bole to Wakkanai. Here are two standard methods:

Vincenty's formula (applied above)
  • 2838.904 miles
  • 4568.773 kilometers
  • 2466.940 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
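
As a rough illustration, the sketch below computes the same ellipsoidal distance with the geographiclib package. geographiclib implements Karney's algorithm rather than Vincenty's original iteration, but both solve the inverse geodesic problem on the WGS-84 ellipsoid, so the result should land very close to the ~2839-mile figure above. The decimal coordinates are converted from the DMS values listed in the airport information section below.

```python
# Ellipsoidal (geodesic) distance between BPL and WKJ.
# Uses Karney's algorithm (geographiclib), not Vincenty's, but both work on
# the WGS-84 ellipsoid, so the result should closely match the figure above.
from geographiclib.geodesic import Geodesic

# Airport coordinates in decimal degrees (converted from the DMS values below)
BPL = (44.8950, 82.3000)    # Alashankou Bole (Bortala) Airport
WKJ = (45.4042, 141.8008)   # Wakkanai Airport

result = Geodesic.WGS84.Inverse(BPL[0], BPL[1], WKJ[0], WKJ[1])
meters = result["s12"]  # geodesic distance in meters

print(f"{meters / 1609.344:.1f} miles")        # ~2839 miles
print(f"{meters / 1000:.1f} kilometers")       # ~4569 km
print(f"{meters / 1852:.1f} nautical miles")   # ~2467 NM
```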

Haversine formula
  • 2830.970 miles
  • 4556.004 kilometers
  • 2460.046 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
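
Because the haversine formula has a simple closed form, it is easy to implement directly. The minimal sketch below assumes a mean Earth radius of 6371 km; using a slightly different radius would shift the result by a few miles.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    c = 2 * math.asin(math.sqrt(a))
    return radius_km * c / 1.609344  # km -> miles

# BPL -> WKJ, coordinates in decimal degrees
print(round(haversine_miles(44.8950, 82.3000, 45.4042, 141.8008), 1))  # ~2831 miles
```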

How long does it take to fly from Bole to Wakkanai?

The estimated flight time from Alashankou Bole (Bortala) Airport to Wakkanai Airport is 5 hours and 52 minutes.
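
The exact assumptions behind the 5 hours 52 minutes figure are not stated. A common rule of thumb, sketched below, assumes an average block speed of about 500 mph plus roughly 30 minutes for taxi, takeoff, climb, and descent; with these assumed parameters the result differs slightly from the estimate above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise leg plus a fixed taxi/climb/descent buffer."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(2838.904))  # ~6 h 11 min with these assumed parameters
```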

Flight carbon footprint between Alashankou Bole (Bortala) Airport (BPL) and Wakkanai Airport (WKJ)

On average, flying from Bole to Wakkanai generates about 315 kg of CO2 per passenger; 315 kilograms equals 695 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
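
The kilogram-to-pound conversion, and the per-kilometre emission factor implied by the quoted figures, can be checked in a few lines. The ~0.07 kg per passenger-km factor below is reverse-engineered from the numbers above and is not the site's published methodology.

```python
KG_PER_LB = 0.45359237

co2_kg = 315                  # per-passenger figure quoted above for BPL -> WKJ
print(round(co2_kg / KG_PER_LB))  # ~694 lb; the 695 lb above presumably comes
                                  # from an unrounded kg value

# Implied per-passenger emission factor for this route (an assumption derived
# from the quoted figures, not an official methodology):
distance_km = 4568.773
print(round(co2_kg / distance_km, 3), "kg CO2 per passenger-km")  # ~0.069
```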

Map of flight path and driving directions from Bole to Wakkanai

See the map of the shortest flight path between Alashankou Bole (Bortala) Airport (BPL) and Wakkanai Airport (WKJ).

Airport information

Origin: Alashankou Bole (Bortala) Airport
City: Bole
Country: China
IATA Code: BPL
ICAO Code: ZWBL
Coordinates: 44°53′42″N, 82°18′0″E
Destination: Wakkanai Airport
City: Wakkanai
Country: Japan
IATA Code: WKJ
ICAO Code: RJCW
Coordinates: 45°24′15″N, 141°48′3″E
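
The coordinates above are given in degrees, minutes, and seconds. A small helper like the following (a hypothetical utility, for illustration) converts them to the decimal degrees used in the distance sketches earlier on this page.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# BPL: 44°53′42″N, 82°18′0″E   -> (44.8950, 82.3000)
# WKJ: 45°24′15″N, 141°48′3″E  -> (45.4042, 141.8008)
print(dms_to_decimal(44, 53, 42, "N"), dms_to_decimal(82, 18, 0, "E"))
print(dms_to_decimal(45, 24, 15, "N"), dms_to_decimal(141, 48, 3, "E"))
```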