How far is Brandon from Guangzhou?
The distance between Guangzhou (Guangzhou Baiyun International Airport) and Brandon (Brandon Municipal Airport) is 6990 miles / 11249 kilometers / 6074 nautical miles.
Guangzhou Baiyun International Airport – Brandon Municipal Airport
Distance from Guangzhou to Brandon
There are several ways to calculate the distance from Guangzhou to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 6989.974 miles
- 11249.273 kilometers
- 6074.122 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
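For reference, the ellipsoidal calculation can be sketched directly from Vincenty's inverse formula. This is a minimal Python implementation on the WGS-84 ellipsoid, not necessarily the exact code behind the figure above:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid.
    Returns the geodesic distance in kilometres."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    # Reduced latitudes and longitude difference (normalised to [-180, 180])
    u1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    u2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    big_l = math.radians(((lon2 - lon1 + 180) % 360) - 180)
    su1, cu1 = math.sin(u1), math.cos(u1)
    su2, cu2 = math.sin(u2), math.cos(u2)

    lam = big_l
    for _ in range(200):          # iterate until the longitude converges
        sl, cl = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cu2 * sl, cu1 * su2 - su1 * cu2 * cl)
        cos_sigma = su1 * su2 + cu1 * cu2 * cl
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cu1 * cu2 * sl / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * su1 * su2 / cos2_alpha if cos2_alpha else 0.0
        c = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = big_l + (1 - c) * f * sin_alpha * (
            sigma + c * sin_sigma * (
                cos_2sm + c * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    big_a = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    big_b = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * big_a * (sigma - delta_sigma) / 1000.0

# CAN (23°23′32″N, 113°17′56″E) to YBR (49°54′36″N, 99°57′6″W)
print(round(vincenty_km(23.392222, 113.298889, 49.91, -99.951667), 1))
```

The iteration solves for the difference in longitude on the auxiliary sphere; for routes like this one, well away from the antipodal case, it converges in a handful of passes.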
Haversine formula
- 6977.394 miles
- 11229.027 kilometers
- 6063.190 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
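The great-circle calculation can be sketched in a few lines of Python. This minimal example uses the airport coordinates listed below and assumes a mean Earth radius of 6,371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, assuming a
    spherical Earth with mean radius 6371 km."""
    r = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# CAN: 23°23′32″N, 113°17′56″E;  YBR: 49°54′36″N, 99°57′6″W
can = (23 + 23 / 60 + 32 / 3600, 113 + 17 / 60 + 56 / 3600)
ybr = (49 + 54 / 60 + 36 / 3600, -(99 + 57 / 60 + 6 / 3600))
print(round(haversine_km(*can, *ybr), 1))  # ≈ 11229 km
```

The roughly 20 km gap between this result and the Vincenty figure reflects the spherical-Earth simplification.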
How long does it take to fly from Guangzhou to Brandon?
The estimated flight time from Guangzhou Baiyun International Airport to Brandon Municipal Airport is 13 hours and 44 minutes.
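The quoted duration implies an average speed of roughly 509 mph over the 6,990-mile route. A back-of-the-envelope sketch of that estimate, where the 509 mph figure is an assumption inferred from the numbers above rather than the site's actual method:

```python
# Rough flight-time estimate: distance divided by an assumed average speed.
# 509 mph is inferred from the quoted figures, not a published cruise speed.
distance_mi = 6990
avg_speed_mph = 509

total_minutes = round(distance_mi / avg_speed_mph * 60)
hours, minutes = divmod(total_minutes, 60)
print(f"{hours} h {minutes} min")  # 13 h 44 min
```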
What is the time difference between Guangzhou and Brandon?
The time difference between Guangzhou and Brandon is 14 hours. Brandon is 14 hours behind Guangzhou.
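The 14-hour offset can be checked with Python's `zoneinfo` module. The zone names here are my assumptions (Asia/Shanghai covers Guangzhou; America/Winnipeg covers Brandon, Manitoba); note that Manitoba observes daylight saving time, so the gap narrows to 13 hours in summer:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# A winter date, when Manitoba is on Central Standard Time (UTC-6)
guangzhou = datetime(2024, 1, 15, 12, 0, tzinfo=ZoneInfo("Asia/Shanghai"))
brandon = guangzhou.astimezone(ZoneInfo("America/Winnipeg"))

diff = (guangzhou.utcoffset() - brandon.utcoffset()).total_seconds() / 3600
print(f"Brandon is {diff:.0f} hours behind Guangzhou")
```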
Flight carbon footprint between Guangzhou Baiyun International Airport (CAN) and Brandon Municipal Airport (YBR)
On average, flying from Guangzhou to Brandon generates about 853 kg of CO2 per passenger, and 853 kilograms equals 1,881 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
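The kilograms-to-pounds conversion above is straightforward:

```python
KG_TO_LB = 2.20462262  # pounds per kilogram

co2_kg = 853
co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb))  # 1881
```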
Map of flight path from Guangzhou to Brandon
See the map of the shortest flight path between Guangzhou Baiyun International Airport (CAN) and Brandon Municipal Airport (YBR).
Airport information
| Origin | Guangzhou Baiyun International Airport |
| --- | --- |
| City | Guangzhou |
| Country | China |
| IATA Code | CAN |
| ICAO Code | ZGGG |
| Coordinates | 23°23′32″N, 113°17′56″E |
| Destination | Brandon Municipal Airport |
| --- | --- |
| City | Brandon |
| Country | Canada |
| IATA Code | YBR |
| ICAO Code | CYBR |
| Coordinates | 49°54′36″N, 99°57′6″W |