How far is Bole from Phoenix, AZ?

The distance between Phoenix (Phoenix Sky Harbor International Airport) and Bole (Alashankou Bole (Bortala) Airport) is 6966 miles / 11210 kilometers / 6053 nautical miles.

Phoenix Sky Harbor International Airport – Alashankou Bole (Bortala) Airport

6966 miles
11210 kilometers
6053 nautical miles

Distance from Phoenix to Bole

There are several ways to calculate the distance from Phoenix to Bole. Here are two standard methods:

Vincenty's formula (applied above)
  • 6965.876 miles
  • 11210.491 kilometers
  • 6053.181 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
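As a sketch (not necessarily this site's exact implementation), here is Vincenty's inverse formula in Python on the WGS-84 ellipsoid, using PHX's and BPL's coordinates from the airport information below; the constants and convergence tolerance are standard choices:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0                     # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563             # WGS-84 flattening
    b = (1 - f) * a                   # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L                           # first approximation of longitude difference
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)               # equatorial geodesic
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344          # metres -> statute miles

# PHX (33°26′3″N, 112°0′43″W) to BPL (44°53′42″N, 82°18′0″E), in decimal degrees
print(vincenty_miles(33.434167, -112.011944, 44.895, 82.3))   # ≈ 6965.9 miles
```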

Haversine formula
  • 6950.748 miles
  • 11186.144 kilometers
  • 6040.035 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
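A compact Python version, assuming a mean Earth radius of 6371 km (the result shifts slightly with the radius chosen):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance assuming a spherical Earth of mean radius 6371 km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(h))
    return km / 1.609344              # km -> statute miles

print(haversine_miles(33.434167, -112.011944, 44.895, 82.3))  # ≈ 6950.7 miles
```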

How long does it take to fly from Phoenix to Bole?

The estimated flight time from Phoenix Sky Harbor International Airport to Alashankou Bole (Bortala) Airport is 13 hours and 41 minutes.
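That figure is consistent with a simple model of roughly 30 minutes of taxi, takeoff, and landing overhead plus cruise at about 850 km/h; both numbers are assumptions here, and the site's actual method may differ:

```python
def flight_time(distance_km, cruise_kmh=850.0, overhead_min=30):
    """Assumed model: fixed taxi/climb/descent overhead plus constant-speed cruise."""
    total_min = overhead_min + distance_km / cruise_kmh * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(flight_time(11210))   # "13 hours and 41 minutes" under these assumptions
```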

Flight carbon footprint between Phoenix Sky Harbor International Airport (PHX) and Alashankou Bole (Bortala) Airport (BPL)

On average, flying from Phoenix to Bole generates about 850 kg of CO2 per passenger, which is equivalent to 1,874 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
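The pound conversion is a fixed factor; the per-kilometre rate in the comment below is simply the figure above divided by the route distance, not a published emission factor:

```python
KG_PER_LB = 0.45359237               # exact kilograms per avoirdupois pound

co2_kg = 850                         # per-passenger estimate quoted above
print(f"{co2_kg / KG_PER_LB:,.0f} lbs")                       # 1,874 lbs
print(f"{co2_kg * 1000 / 11210:.1f} g CO2 per passenger-km")  # ≈ 75.8 g, implied rate
```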

Map of flight path from Phoenix to Bole

See the map of the shortest flight path between Phoenix Sky Harbor International Airport (PHX) and Alashankou Bole (Bortala) Airport (BPL).

Airport information

Origin Phoenix Sky Harbor International Airport
City: Phoenix, AZ
Country: United States
IATA Code: PHX
ICAO Code: KPHX
Coordinates: 33°26′3″N, 112°0′43″W
Destination Alashankou Bole (Bortala) Airport
City: Bole
Country: China
IATA Code: BPL
ICAO Code: ZWBL
Coordinates: 44°53′42″N, 82°18′0″E
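The coordinates above are given in degrees, minutes, and seconds; a small helper (hypothetical, for illustration) converts them to the signed decimal degrees used in the distance sketches earlier:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

phx = (dms_to_decimal(33, 26, 3, "N"), dms_to_decimal(112, 0, 43, "W"))
bpl = (dms_to_decimal(44, 53, 42, "N"), dms_to_decimal(82, 18, 0, "E"))
print(phx)   # ≈ (33.4342, -112.0119)
print(bpl)   # ≈ (44.8950, 82.3000)
```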