How far is Brandon from Nanaimo?
The distance between Nanaimo (Nanaimo Airport) and Brandon (Brandon Municipal Airport) is 1074 miles / 1728 kilometers / 933 nautical miles.
The driving distance from Nanaimo (YCD) to Brandon (YBR) is 1356 miles / 2183 kilometers, and travel time by car is about 27 hours 52 minutes.
Nanaimo Airport – Brandon Municipal Airport
Distance from Nanaimo to Brandon
There are several ways to calculate the distance from Nanaimo to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 1074.006 miles
- 1728.445 kilometers
- 933.286 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
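As a rough sketch (not necessarily the calculator this site uses), the same ellipsoidal distance can be reproduced with the geographiclib package, which solves the inverse geodesic problem on the WGS-84 ellipsoid via Karney's algorithm rather than Vincenty's iteration, but agrees to well under a metre. The decimal coordinates are converted from the airport tables below.

```python
# Ellipsoidal (Vincenty-style) distance sketch using geographiclib
# (pip install geographiclib).
from geographiclib.geodesic import Geodesic

ycd = (49.05222, -123.87000)  # Nanaimo Airport, decimal degrees
ybr = (49.91000, -99.95167)   # Brandon Municipal Airport

result = Geodesic.WGS84.Inverse(ycd[0], ycd[1], ybr[0], ybr[1])
km = result["s12"] / 1000     # "s12" is the geodesic distance in metres
print(f"{km / 1.609344:.3f} mi / {km:.3f} km / {km / 1.852:.3f} NM")
# -> roughly 1074.0 mi / 1728.4 km / 933.3 NM
```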
Haversine formula
- 1070.738 miles
- 1723.186 kilometers
- 930.446 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance, which is the shortest path between two points along the earth's surface.
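For comparison, here is a minimal haversine sketch in plain Python. With a mean earth radius of 6371 km and the same airport coordinates, it reproduces the great-circle figure quoted above.

```python
# Great-circle distance on a spherical earth (haversine formula).
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Distance between two lat/lon points assuming a spherical earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(49.05222, -123.87000, 49.91000, -99.95167)
print(f"{km:.3f} km = {km / 1.609344:.3f} mi")  # ~1723.2 km / ~1070.7 mi
```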
How long does it take to fly from Nanaimo to Brandon?
The estimated flight time from Nanaimo Airport to Brandon Municipal Airport is 2 hours and 32 minutes.
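The page does not state how the flight time is derived. A common convention on distance calculators is a fixed 30-minute allowance for take-off and landing plus cruise at an assumed average speed of about 500 mph; both constants in the sketch below are assumptions, which is why it lands a few minutes off the quoted figure.

```python
# Back-of-the-envelope flight-time estimate. Both constants are
# assumptions, not this site's published method.
DISTANCE_MI = 1074
CRUISE_MPH = 500      # assumed average cruise speed
OVERHEAD_MIN = 30     # assumed take-off/landing allowance

minutes = round(DISTANCE_MI / CRUISE_MPH * 60 + OVERHEAD_MIN)
print(f"{minutes // 60} h {minutes % 60} min")  # -> 2 h 39 min
```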
What is the time difference between Nanaimo and Brandon?
The time difference between Nanaimo and Brandon is 2 hours. Brandon is 2 hours ahead of Nanaimo.
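This can be checked with Python's standard zoneinfo module: Nanaimo observes Pacific time (America/Vancouver) and Brandon Central time (America/Winnipeg), and since both observe daylight saving, the gap is 2 hours year-round.

```python
# Verify the 2-hour offset between the two cities (Python 3.9+).
from datetime import datetime
from zoneinfo import ZoneInfo

now = datetime.now(ZoneInfo("UTC"))
pacific = now.astimezone(ZoneInfo("America/Vancouver")).utcoffset()
central = now.astimezone(ZoneInfo("America/Winnipeg")).utcoffset()
print((central - pacific).total_seconds() / 3600)  # -> 2.0
```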
Flight carbon footprint between Nanaimo Airport (YCD) and Brandon Municipal Airport (YBR)
On average, flying from Nanaimo to Brandon generates about 155 kg (342 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
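The page's exact methodology isn't given. A common approach multiplies fuel burned per passenger by roughly 3.16 kg of CO2 per kg of jet fuel (a standard emission factor); the per-passenger fuel-burn rate below is an assumption chosen only to illustrate how such arithmetic lands near the quoted figure.

```python
# Hedged sketch of a per-passenger CO2 estimate; constants are assumptions.
DISTANCE_KM = 1728
CO2_PER_KG_FUEL = 3.16     # kg CO2 per kg jet fuel (standard factor)
FUEL_PER_PAX_KM = 0.0284   # kg fuel per passenger-km (assumed)

co2_kg = DISTANCE_KM * FUEL_PER_PAX_KM * CO2_PER_KG_FUEL
print(f"{co2_kg:.0f} kg CO2 = {co2_kg * 2.20462:.0f} lbs")  # ~155 kg / ~342 lbs
```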
Map of flight path and driving directions from Nanaimo to Brandon
See the map of the shortest flight path between Nanaimo Airport (YCD) and Brandon Municipal Airport (YBR).
Airport information
| Origin | Nanaimo Airport |
| --- | --- |
| City | Nanaimo |
| Country | Canada |
| IATA Code | YCD |
| ICAO Code | CYCD |
| Coordinates | 49°3′8″N, 123°52′12″W |

| Destination | Brandon Municipal Airport |
| --- | --- |
| City | Brandon |
| Country | Canada |
| IATA Code | YBR |
| ICAO Code | CYBR |
| Coordinates | 49°54′36″N, 99°57′6″W |
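The coordinates above are given in degrees, minutes, and seconds, while the distance sketches earlier use decimal degrees; a small helper shows the conversion.

```python
# Convert degrees/minutes/seconds to the decimal degrees used above.
def dms_to_decimal(deg, minutes, seconds, negative=False):
    """West longitudes and south latitudes pass negative=True."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if negative else value

# Nanaimo: 49°3′8″N, 123°52′12″W -> (49.05222, -123.87000)
print(dms_to_decimal(49, 3, 8), dms_to_decimal(123, 52, 12, negative=True))
# Brandon: 49°54′36″N, 99°57′6″W -> (49.91000, -99.95167)
print(dms_to_decimal(49, 54, 36), dms_to_decimal(99, 57, 6, negative=True))
```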