
How far is Bayanhot from Toronto?

The distance between Toronto (Toronto Pearson International Airport) and Bayanhot (Alxa Left Banner Bayanhot Airport) is 6748 miles / 10860 kilometers / 5864 nautical miles.

Toronto Pearson International Airport – Alxa Left Banner Bayanhot Airport
6748 miles / 10860 kilometers / 5864 nautical miles


Distance from Toronto to Bayanhot

There are several ways to calculate the distance from Toronto to Bayanhot. Here are two standard methods:

Vincenty's formula (applied above)
  • 6748.318 miles
  • 10860.366 kilometers
  • 5864.128 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
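For reference, a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid is shown below. This is not the calculator's own code; the decimal coordinates are converted from the DMS values in the airport information section, so the result reproduces the figure above only approximately.

import math

def vincenty_distance_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty's inverse formula on the WGS-84 ellipsoid (iterative;
    may fail to converge for nearly antipodal points)."""
    a = 6378137.0              # semi-major axis in metres
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma * (
            cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    d_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2) -
        B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0                 # kilometres

# YYZ and AXF in decimal degrees (converted from the DMS coordinates below)
print(round(vincenty_distance_km(43.6769, -79.6306, 38.7481, 105.5883), 1))  # ≈ 10860 km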

Haversine formula
  • 6732.417 miles
  • 10834.775 kilometers
  • 5850.311 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
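By contrast, the haversine version fits in a few lines. A minimal Python sketch, assuming a mean Earth radius of 6371 km (again, not necessarily the exact constants used by the calculator):

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# YYZ → AXF in decimal degrees (converted from the DMS coordinates below)
d_km = haversine_km(43.6769, -79.6306, 38.7481, 105.5883)
print(f"{d_km:.0f} km / {d_km * 0.621371:.0f} mi / {d_km / 1.852:.0f} nm")  # ≈ 10835 km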

How long does it take to fly from Toronto to Bayanhot?

The estimated flight time from Toronto Pearson International Airport to Alxa Left Banner Bayanhot Airport is 13 hours and 16 minutes.
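The calculator's exact assumptions (cruise speed, taxi time, winds) are not stated, but as a quick consistency check, the stated time implies an average block speed of roughly 509 mph over the Vincenty distance:

distance_mi = 6748.318          # Vincenty distance from above
block_time_h = 13 + 16 / 60     # 13 hours 16 minutes
print(f"{distance_mi / block_time_h:.0f} mph")   # ≈ 509 mph average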

Flight carbon footprint between Toronto Pearson International Airport (YYZ) and Alxa Left Banner Bayanhot Airport (AXF)

On average, flying from Toronto to Bayanhot generates about 820 kg of CO2 per passenger; 820 kilograms is equal to 1,807 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
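The pound figure is simply the metric figure converted at about 2.20462 lb per kilogram:

co2_kg = 820
print(f"{co2_kg * 2.20462:.1f} lb")   # ≈ 1807.8 lb, shown above as 1,807 lb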

Map of flight path from Toronto to Bayanhot

See the map of the shortest flight path between Toronto Pearson International Airport (YYZ) and Alxa Left Banner Bayanhot Airport (AXF).

Airport information

Origin: Toronto Pearson International Airport
City: Toronto
Country: Canada
IATA Code: YYZ
ICAO Code: CYYZ
Coordinates: 43°40′37″N, 79°37′50″W
Destination: Alxa Left Banner Bayanhot Airport
City: Bayanhot
Country: China
IATA Code: AXF
ICAO Code: ZBAL
Coordinates: 38°44′53″N, 105°35′18″E