Air Miles Calculator

How far is Bole from Syracuse, NY?

The distance between Syracuse (Syracuse Hancock International Airport) and Bole (Alashankou Bole (Bortala) Airport) is 6229 miles / 10024 kilometers / 5412 nautical miles.

Syracuse Hancock International Airport – Alashankou Bole (Bortala) Airport

6229 Miles
10024 Kilometers
5412 Nautical miles


Distance from Syracuse to Bole

There are several ways to calculate the distance from Syracuse to Bole. Here are two standard methods:

Vincenty's formula (applied above)
  • 6228.559 miles
  • 10023.894 kilometers
  • 5412.470 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
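The figure above can be reproduced with a standard implementation of Vincenty's inverse formula. The sketch below assumes the WGS84 ellipsoid parameters (semi-major axis 6378137 m, flattening 1/298.257223563) and the airport coordinates listed at the bottom of this page; the site does not state which ellipsoid it uses, so small differences are possible.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS84 ellipsoid; returns meters."""
    a = 6378137.0              # semi-major axis (m), WGS84
    f = 1 / 298.257223563      # flattening, WGS84
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L                    # first approximation of the auxiliary longitude
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# SYR (43°6′40″N, 76°6′22″W) to BPL (44°53′42″N, 82°18′0″E)
syr = (43 + 6 / 60 + 40 / 3600, -(76 + 6 / 60 + 22 / 3600))
bpl = (44 + 53 / 60 + 42 / 3600, 82 + 18 / 60)
meters = vincenty_inverse(syr[0], syr[1], bpl[0], bpl[1])
print(round(meters / 1000, 3), "km")  # should agree with the ~10024 km figure above
```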

Haversine formula
  • 6212.454 miles
  • 9997.975 kilometers
  • 5398.475 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
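The haversine calculation is short enough to sketch directly. The example below assumes a mean Earth radius of 6371 km (a common convention; the site's exact radius choice is not stated, so the last digit may differ):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# SYR (43°6′40″N, 76°6′22″W) to BPL (44°53′42″N, 82°18′0″E)
d = haversine_km(43 + 6/60 + 40/3600, -(76 + 6/60 + 22/3600),
                 44 + 53/60 + 42/3600, 82 + 18/60)
print(round(d, 1), "km")  # close to the ~9998 km figure above
```

The ellipsoidal (Vincenty) and spherical (haversine) results differ by about 26 km here, roughly 0.26% — typical for long routes at mid latitudes.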

How long does it take to fly from Syracuse to Bole?

The estimated flight time from Syracuse Hancock International Airport to Alashankou Bole (Bortala) Airport is 12 hours and 17 minutes.
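Estimates like this are typically distance divided by an assumed average block speed. The site does not publish its speed assumption, so the sketch below uses a round 500 mph, which lands in the same ballpark as the quoted 12 h 17 min:

```python
# Back-of-envelope flight-time estimate: distance / average block speed.
# The 500 mph figure is an assumption for illustration; actual block
# speeds vary with aircraft type, winds, and routing.
distance_miles = 6228.559        # Vincenty distance from above
avg_speed_mph = 500              # assumed average block speed
hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")          # 12 h 27 min at the assumed 500 mph
```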

Flight carbon footprint between Syracuse Hancock International Airport (SYR) and Alashankou Bole (Bortala) Airport (BPL)

On average, flying from Syracuse to Bole generates about 748 kg (1,649 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
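As a quick sanity check on the unit conversion, 1 kilogram is about 2.20462 pounds:

```python
co2_kg = 748
co2_lbs = co2_kg * 2.20462       # 1 kg ≈ 2.20462 lb
print(round(co2_lbs), "lbs")     # 1649 lbs
```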

Map of flight path from Syracuse to Bole

See the map of the shortest flight path between Syracuse Hancock International Airport (SYR) and Alashankou Bole (Bortala) Airport (BPL).

Airport information

Origin Syracuse Hancock International Airport
City: Syracuse, NY
Country: United States
IATA Code: SYR
ICAO Code: KSYR
Coordinates: 43°6′40″N, 76°6′22″W
Destination Alashankou Bole (Bortala) Airport
City: Bole
Country: China
IATA Code: BPL
ICAO Code: ZWBL
Coordinates: 44°53′42″N, 82°18′0″E