
How far is Bole from Namangan?

The distance between Namangan (Namangan Airport) and Bole (Alashankou Bole (Bortala) Airport) is 607 miles / 977 kilometers / 528 nautical miles.

The driving distance from Namangan (NMA) to Bole (BPL) is 826 miles / 1330 kilometers, and travel time by car is about 18 hours 19 minutes.


Distance from Namangan to Bole

There are several ways to calculate the distance from Namangan to Bole. Here are two standard methods:

Vincenty's formula (applied above)
  • 607.388 miles
  • 977.496 kilometers
  • 527.806 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
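
For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name vincenty_inverse is ours, and the decimal coordinates are converted from the DMS values in the airport information section below; the result should match the ~977.5 km figure above to within rounding.

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        # WGS-84 ellipsoid parameters
        a = 6378137.0              # semi-major axis (metres)
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos(2 * sigma_m); zero on the equatorial line where cos2_alpha == 0
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)   # geodesic distance in metres

    metres = vincenty_inverse(40.984444, 71.556667, 44.895000, 82.300000)  # NMA -> BPL
    print(f"{metres / 1609.344:.3f} mi / {metres / 1000:.3f} km / {metres / 1852:.3f} NM")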

Haversine formula
  • 606.199 miles
  • 975.583 kilometers
  • 526.773 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
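
A haversine sketch in the same style, assuming the commonly used mean Earth radius of 6,371 km (the exact result depends on which radius the calculator uses):

    import math

    def haversine(lat1, lon1, lat2, lon2, radius_m=6371000.0):
        # great-circle distance on a sphere of the given radius
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_m * math.asin(math.sqrt(a))

    metres = haversine(40.984444, 71.556667, 44.895000, 82.300000)  # NMA -> BPL
    print(f"{metres / 1609.344:.3f} mi / {metres / 1000:.3f} km / {metres / 1852:.3f} NM")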

How long does it take to fly from Namangan to Bole?

The estimated flight time from Namangan Airport to Alashankou Bole (Bortala) Airport is 1 hour and 38 minutes.
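
The calculator's exact flight-time model isn't published, but a common back-of-the-envelope approach adds a fixed taxi/climb/descent allowance to time at cruise speed. A sketch with assumed parameters (30 minutes of overhead, 500 mph cruise) lands within a few minutes of the figure above:

    def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
        # assumed parameters, not the site's published model
        minutes = overhead_min + distance_miles / cruise_mph * 60.0
        hours, mins = divmod(round(minutes), 60)
        return f"{hours} hour(s) {mins} minute(s)"

    print(estimate_flight_time(607.388))  # ~1 hour 43 minutes with these assumptions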

Flight carbon footprint between Namangan Airport (NMA) and Alashankou Bole (Bortala) Airport (BPL)

On average, flying from Namangan to Bole generates about 114 kg of CO2 per passenger, which is equivalent to roughly 251 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
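
Back-solving from the page's own numbers gives an effective emission factor of about 0.188 kg of CO2 per passenger-mile. The factor below is derived from those figures, not from an official emissions model:

    KG_CO2_PER_PASSENGER_MILE = 114 / 607.388   # ~0.188, back-solved from the figures above
    LB_PER_KG = 2.20462

    def co2_per_passenger(distance_miles):
        kg = distance_miles * KG_CO2_PER_PASSENGER_MILE
        return kg, kg * LB_PER_KG

    kg, lb = co2_per_passenger(607.388)
    print(f"{kg:.0f} kg CO2 = {lb:.0f} lb")   # 114 kg = 251 lb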

Map of flight path and driving directions from Namangan to Bole

See the map of the shortest flight path between Namangan Airport (NMA) and Alashankou Bole (Bortala) Airport (BPL).

Airport information

Origin: Namangan Airport
City: Namangan
Country: Uzbekistan
IATA Code: NMA
ICAO Code: UTKN
Coordinates: 40°59′4″N, 71°33′24″E
Destination: Alashankou Bole (Bortala) Airport
City: Bole
Country: China
IATA Code: BPL
ICAO Code: ZWBL
Coordinates: 44°53′42″N, 82°18′0″E
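
The coordinates above are given in degrees, minutes, and seconds; the decimal values used in the distance sketches earlier come from a conversion like this (the helper name dms_to_decimal is ours):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # southern and western hemispheres get a negative sign
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(40, 59, 4, "N"))   # 40.984444...  (NMA latitude)
    print(dms_to_decimal(71, 33, 24, "E"))  # 71.556666...  (NMA longitude)
    print(dms_to_decimal(44, 53, 42, "N"))  # 44.895        (BPL latitude)
    print(dms_to_decimal(82, 18, 0, "E"))   # 82.3          (BPL longitude)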