How far is Brandon from Beijing?
The distance between Beijing (Beijing Capital International Airport) and Brandon (Brandon Municipal Airport) is 5851 miles / 9415 kilometers / 5084 nautical miles.
Beijing Capital International Airport – Brandon Municipal Airport
Distance from Beijing to Brandon
There are several ways to calculate the distance from Beijing to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 5850.520 miles
- 9415.500 kilometers
- 5083.963 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the Earth's surface using an ellipsoidal model of the planet.
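As a sketch of how an ellipsoidal distance like this can be computed, the snippet below uses the third-party geopy library, whose geodesic distance is calculated on the WGS-84 ellipsoid (using Karney's algorithm, which agrees very closely with Vincenty's). The decimal coordinates are converted from the DMS values in the airport tables below.

```python
# pip install geopy
from geopy.distance import geodesic

pek = (40.08, 116.5847)   # PEK in decimal degrees (from the table below)
ybr = (49.91, -99.9517)   # YBR in decimal degrees

d = geodesic(pek, ybr)    # WGS-84 ellipsoid by default
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} nm")
```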
Haversine formula
- 5835.136 miles
- 9390.742 kilometers
- 5070.595 nautical miles
The Haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth (great-circle distance, the shortest path between two points along the surface).
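The Haversine calculation fits in a few lines of pure Python. The Earth radius and the rounded coordinates below are assumptions, so the result lands within a few miles of the figure above rather than matching it exactly.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a sphere of the given (assumed) radius."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_miles * asin(sqrt(a))

# PEK (40°4′48″N, 116°35′5″E) and YBR (49°54′36″N, 99°57′6″W) in decimal degrees
print(haversine_miles(40.08, 116.5847, 49.91, -99.9517))  # ~5845 miles
```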
How long does it take to fly from Beijing to Brandon?
The estimated flight time from Beijing Capital International Airport to Brandon Municipal Airport is 11 hours and 34 minutes.
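The model behind this estimate isn't stated. A common rule of thumb, sketched below with assumed parameters, is an average cruise speed of about 500 mph plus a fixed allowance for takeoff and landing, which gives roughly 12 hours for this route.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    # cruise_mph and overhead_hours are assumed parameters, not the site's model
    total_minutes = round((overhead_hours + distance_miles / cruise_mph) * 60)
    h, m = divmod(total_minutes, 60)
    return f"{h} hours and {m} minutes"

print(estimated_flight_time(5850.520))  # "12 hours and 12 minutes" with these assumptions
```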
What is the time difference between Beijing and Brandon?
The time difference between Beijing and Brandon is 14 hours: Brandon is 14 hours behind Beijing. (While daylight saving time is in effect in Manitoba, the difference narrows to 13 hours.)
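This is easy to verify with Python's standard zoneinfo module, assuming Brandon follows the America/Winnipeg zone (it does, as part of Manitoba):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Beijing is UTC+8 year-round; Brandon follows America/Winnipeg
# (UTC-6 in winter, UTC-5 during daylight saving time).
when = datetime(2024, 1, 15, 12, 0)  # an arbitrary winter date
beijing_offset = when.replace(tzinfo=ZoneInfo("Asia/Shanghai")).utcoffset()
brandon_offset = when.replace(tzinfo=ZoneInfo("America/Winnipeg")).utcoffset()
print(beijing_offset - brandon_offset)  # 14:00:00 (13:00:00 on a summer date)
```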
Flight carbon footprint between Beijing Capital International Airport (PEK) and Brandon Municipal Airport (YBR)
On average, flying from Beijing to Brandon generates about 696 kg of CO2 per passenger (equivalent to 1,535 pounds). The figures are estimates and include only the CO2 generated by burning jet fuel.
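The emission model isn't disclosed, but dividing the stated figure by the distance implies roughly 0.12 kg of CO2 per passenger-mile; the sketch below simply applies that back-derived rate as a flat assumption.

```python
# Back-derived from the figures above: 696 kg over ~5,851 miles is about
# 0.119 kg of CO2 per passenger-mile. Treating it as a flat rate is an assumption.
CO2_KG_PER_PASSENGER_MILE = 0.119

def flight_co2_kg(distance_miles):
    return distance_miles * CO2_KG_PER_PASSENGER_MILE

print(round(flight_co2_kg(5850.520)))  # ~696 kg per passenger
```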
Map of flight path from Beijing to Brandon
See the map of the shortest flight path between Beijing Capital International Airport (PEK) and Brandon Municipal Airport (YBR).
Airport information
| Origin | Beijing Capital International Airport |
|---|---|
| City | Beijing |
| Country | China |
| IATA Code | PEK |
| ICAO Code | ZBAA |
| Coordinates | 40°4′48″N, 116°35′5″E |
| Destination | Brandon Municipal Airport |
|---|---|
| City | Brandon |
| Country | Canada |
| IATA Code | YBR |
| ICAO Code | CYBR |
| Coordinates | 49°54′36″N, 99°57′6″W |
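For completeness, the DMS coordinates in these tables convert to the decimal degrees used by the distance formulas above; dms_to_decimal below is a hypothetical helper for that conversion.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Coordinates in the southern or western hemisphere become negative
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(40, 4, 48, "N"))   # PEK latitude  -> 40.08
print(dms_to_decimal(116, 35, 5, "E"))  # PEK longitude -> ~116.5847
print(dms_to_decimal(49, 54, 36, "N"))  # YBR latitude  -> 49.91
print(dms_to_decimal(99, 57, 6, "W"))   # YBR longitude -> ~-99.9517
```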