
How far is Bole from Ottawa?

The distance between Ottawa (Ottawa Macdonald–Cartier International Airport) and Bole (Alashankou Bole (Bortala) Airport) is 6076 miles / 9778 kilometers / 5280 nautical miles.

Ottawa Macdonald–Cartier International Airport – Alashankou Bole (Bortala) Airport

6076 miles / 9778 kilometers / 5280 nautical miles


Distance from Ottawa to Bole

There are several ways to calculate the distance from Ottawa to Bole. Here are two standard methods:

Vincenty's formula (applied above)
  • 6075.549 miles
  • 9777.649 kilometers
  • 5279.508 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
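Vincenty's method is iterative; below is a minimal Python sketch of the inverse formula on the WGS-84 ellipsoid, using the airport coordinates from the information section at the bottom of the page converted to decimal degrees. The convergence tolerance and iteration cap are arbitrary illustrative choices, not details taken from this page.

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns km."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L

    # Iterate on lambda; no special handling of equatorial or
    # near-antipodal edge cases in this sketch.
    for _ in range(200):
        sin_sig = math.sqrt(
            (math.cos(U2) * math.sin(lam)) ** 2
            + (math.cos(U1) * math.sin(U2)
               - math.sin(U1) * math.cos(U2) * math.cos(lam)) ** 2)
        cos_sig = (math.sin(U1) * math.sin(U2)
                   + math.cos(U1) * math.cos(U2) * math.cos(lam))
        sigma = math.atan2(sin_sig, cos_sig)
        sin_alpha = math.cos(U1) * math.cos(U2) * math.sin(lam) / sin_sig
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sig_m = cos_sig - 2 * math.sin(U1) * math.sin(U2) / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sig * (
                cos_2sig_m + C * cos_sig * (-1 + 2 * cos_2sig_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sig * (
        cos_2sig_m + B / 4 * (
            cos_sig * (-1 + 2 * cos_2sig_m ** 2)
            - B / 6 * cos_2sig_m * (-3 + 4 * sin_sig ** 2)
            * (-3 + 4 * cos_2sig_m ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# YOW -> BPL in decimal degrees; should come out close to the ~9778 km figure above
print(vincenty_distance_km(45.3222, -75.6692, 44.8950, 82.3000))
```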

Haversine formula
  • 6059.360 miles
  • 9751.595 kilometers
  • 5265.440 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
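For comparison, here is a minimal haversine sketch in Python, assuming a mean Earth radius of 6371 km; the exact kilometre figure depends slightly on which radius is used.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YOW -> BPL in decimal degrees; should land near the ~9752 km figure above
print(haversine_km(45.3222, -75.6692, 44.8950, 82.3000))
```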

How long does it take to fly from Ottawa to Bole?

The estimated flight time from Ottawa Macdonald–Cartier International Airport to Alashankou Bole (Bortala) Airport is 12 hours and 0 minutes.
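The page does not state how this estimate is derived. A common rule of thumb is to divide the distance by a typical cruise speed and add a fixed allowance for taxi, climb, and descent; the sketch below assumes roughly 500 mph cruise and a 30-minute allowance, both arbitrary values, so it will not exactly reproduce the figure above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough estimate: cruise time plus a fixed taxi/climb/descent allowance."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimated_flight_time(6076))  # rough rule-of-thumb estimate only
```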

Flight carbon footprint between Ottawa Macdonald–Cartier International Airport (YOW) and Alashankou Bole (Bortala) Airport (BPL)

On average, flying from Ottawa to Bole generates about 727 kg of CO2 per passenger, which is equivalent to about 1,603 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
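As a quick check on the unit conversion and the implied per-kilometre intensity (the 727 kg figure itself is the page's estimate, not derived here):

```python
CO2_KG = 727.0          # per-passenger estimate from this page
DISTANCE_KM = 9778.0    # Vincenty distance from this page
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

print(round(CO2_KG / KG_PER_LB))        # ~1603 lbs
print(round(CO2_KG / DISTANCE_KM, 3))   # ~0.074 kg CO2 per passenger-km
```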

Map of flight path from Ottawa to Bole

See the map of the shortest flight path between Ottawa Macdonald–Cartier International Airport (YOW) and Alashankou Bole (Bortala) Airport (BPL).

Airport information

Origin: Ottawa Macdonald–Cartier International Airport
City: Ottawa
Country: Canada
IATA Code: YOW
ICAO Code: CYOW
Coordinates: 45°19′20″N, 75°40′9″W

Destination: Alashankou Bole (Bortala) Airport
City: Bole
Country: China
IATA Code: BPL
ICAO Code: ZWBL
Coordinates: 44°53′42″N, 82°18′0″E
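The coordinates above are listed in degrees, minutes, and seconds, while the distance formulas earlier on the page work in decimal degrees. A small conversion sketch, using the DMS values copied from the table above:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# YOW: 45°19′20″N, 75°40′9″W  ->  approximately (45.3222, -75.6692)
print(dms_to_decimal(45, 19, 20, "N"), dms_to_decimal(75, 40, 9, "W"))
# BPL: 44°53′42″N, 82°18′0″E  ->  approximately (44.8950, 82.3000)
print(dms_to_decimal(44, 53, 42, "N"), dms_to_decimal(82, 18, 0, "E"))
```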