How far is Bayannur from Ottawa?

The distance between Ottawa (Ottawa Macdonald–Cartier International Airport) and Bayannur (Bayannur Tianjitai Airport) is 6490 miles / 10445 kilometers / 5640 nautical miles.

Ottawa Macdonald–Cartier International Airport – Bayannur Tianjitai Airport

  • 6490 miles
  • 10445 kilometers
  • 5640 nautical miles
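
The three figures are the same distance in different units; the standard conversion factors (1 statute mile = 1.609344 km, 1 nautical mile = 1.852 km) reproduce them, as this minimal Python check shows:

```python
miles = 6490.079                     # Vincenty distance in statute miles (see below)
km = miles * 1.609344                # 1 statute mile = 1.609344 km exactly
nmi = km / 1.852                     # 1 nautical mile = 1.852 km exactly
print(f"{miles:.0f} mi = {km:.0f} km = {nmi:.0f} nmi")   # 6490 mi = 10445 km = 5640 nmi
```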

Distance from Ottawa to Bayannur

There are several ways to calculate the distance from Ottawa to Bayannur. Here are two standard methods:

Vincenty's formula (applied above)
  • 6490.079 miles
  • 10444.769 kilometers
  • 5639.724 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
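
For readers who want to reproduce the figure, here is a minimal, self-contained Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates from the table at the bottom of the page. It follows the standard published iteration, not necessarily the exact code behind this calculator.

```python
import math

def vincenty_inverse_m(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty, 1975)."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1.0 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2.0 * sinU1 * sinU2 / cos_sq_alpha
                        if cos_sq_alpha else 0.0)   # 0 for equatorial lines
        C = f / 16.0 * cos_sq_alpha * (4.0 + f * (4.0 - 3.0 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1.0 + 2.0 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YOW and RLK coordinates (decimal degrees) from the airport table below
metres = vincenty_inverse_m(45.3222, -75.6692, 40.9258, 107.7428)
print(f"{metres / 1000:.0f} km")   # ≈ 10 445 km, matching the figure above
```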

Haversine formula
  • 6473.876 miles
  • 10418.694 kilometers
  • 5625.644 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
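
A minimal haversine sketch in Python, again using the airport coordinates below and an assumed mean earth radius of 6371 km, lands close to the figure above; the small gap versus Vincenty comes from treating the earth as a perfect sphere.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YOW and RLK coordinates (decimal degrees) from the airport table below
yow = (45.3222, -75.6692)
rlk = (40.9258, 107.7428)
print(f"{haversine_km(*yow, *rlk):.0f} km")   # ≈ 10 419 km with R = 6371 km
```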

How long does it take to fly from Ottawa to Bayannur?

The estimated flight time from Ottawa Macdonald–Cartier International Airport to Bayannur Tianjitai Airport is 12 hours and 47 minutes.
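
Estimates like this are typically derived from the great-circle distance, an assumed average cruise speed, and a fixed allowance for taxi, climb, and descent. The sketch below uses illustrative values (500 mph and 30 minutes), not this calculator's actual parameters, so it only roughly approximates the quoted time.

```python
distance_miles = 6490          # Vincenty distance from the section above
cruise_speed_mph = 500         # assumed average ground speed (illustrative)
allowance_hours = 0.5          # assumed allowance for taxi, climb and descent

total_hours = distance_miles / cruise_speed_mph + allowance_hours
hours, minutes = int(total_hours), round(total_hours % 1 * 60)
print(f"Rough estimate: about {hours} h {minutes} min")   # ~13 h 29 min with these assumptions
```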

Flight carbon footprint between Ottawa Macdonald–Cartier International Airport (YOW) and Bayannur Tianjitai Airport (RLK)

On average, flying from Ottawa to Bayannur generates about 784 kg of CO2 per passenger, which is equivalent to 1,728 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
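
The pound figure is simply a unit conversion of the same estimate (1 kg ≈ 2.20462 lb):

```python
co2_kg = 784                          # estimated CO2 per passenger for this route
KG_TO_LB = 2.20462                    # pounds per kilogram
print(f"{co2_kg * KG_TO_LB:.0f} lb")  # ≈ 1728 lb, the figure quoted above
```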

Map of flight path from Ottawa to Bayannur

See the map of the shortest flight path between Ottawa Macdonald–Cartier International Airport (YOW) and Bayannur Tianjitai Airport (RLK).

Airport information

Origin Ottawa Macdonald–Cartier International Airport
City: Ottawa
Country: Canada
IATA Code: YOW
ICAO Code: CYOW
Coordinates: 45°19′20″N, 75°40′9″W

Destination Bayannur Tianjitai Airport
City: Bayannur
Country: China
IATA Code: RLK
ICAO Code: ZBYZ
Coordinates: 40°55′33″N, 107°44′34″E