How far is Brandon from Nakina?
The distance between Nakina (Nakina Airport) and Brandon (Brandon Municipal Airport) is 589 miles / 949 kilometers / 512 nautical miles.
The driving distance from Nakina (YQN) to Brandon (YBR) is 771 miles / 1241 kilometers, and travel time by car is about 17 hours 12 minutes.
Nakina Airport – Brandon Municipal Airport
Distance from Nakina to Brandon
There are several ways to calculate the distance from Nakina to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 589.484 miles
- 948.682 kilometers
- 512.247 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
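For reference, here is a minimal Python sketch of the standard inverse Vincenty iteration on the WGS-84 ellipsoid. It illustrates the method, not this calculator's actual code; the decimal coordinates are converted from the airport information tables below.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0          # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563  # WGS-84 flattening
    b = (1 - f) * a        # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)

    lam = L
    # Iterate on the longitude difference on the auxiliary sphere.
    # (Nearly antipodal points may not converge; not handled in this sketch.)
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam, cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    meters = b * A * (sigma - dSigma)
    return meters / 1609.344  # metres to statute miles

# YQN -> YBR, decimal degrees from the airport information below
print(vincenty_miles(50.1828, -86.6964, 49.9100, -99.9517))  # ~589.5 miles
```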
Haversine formula
- 587.667 miles
- 945.759 kilometers
- 510.669 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
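The haversine formula is compact enough to show in full. A minimal Python sketch, using the same decimal coordinates; the mean Earth radius of 3,958.8 miles is the spherical-model assumption:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles, assuming a spherical earth."""
    R = 3958.8  # mean Earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# YQN -> YBR, decimal degrees from the airport information below
print(haversine_miles(50.1828, -86.6964, 49.9100, -99.9517))  # ~587.7 miles
```

The two results differ by about 1.8 miles because the ellipsoidal model accounts for the earth's flattening while the spherical model does not.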
How long does it take to fly from Nakina to Brandon?
The estimated flight time from Nakina Airport to Brandon Municipal Airport is 1 hour and 36 minutes.
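The page does not say how this estimate is derived. A common rule of thumb, sketched below with purely illustrative constants (a 500 mph average speed and a 30-minute taxi/climb/descent allowance), lands within a few minutes of the quoted figure; it is not the calculator's actual model.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airport-to-airport flight time: cruise at an assumed average
    speed plus a fixed taxi/climb/descent allowance. Both constants are
    illustrative assumptions, not the site's published method."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(589.484))  # "1 h 41 min" under these assumptions
```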
What is the time difference between Nakina and Brandon?
The time difference between Nakina and Brandon is 1 hour. Brandon is 1 hour behind Nakina.
Flight carbon footprint between Nakina Airport (YQN) and Brandon Municipal Airport (YBR)
On average, flying from Nakina to Brandon generates about 111 kg (245 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
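Dividing the quoted figures gives an implied emission factor of roughly 0.19 kg of CO2 per passenger-mile. The sketch below reproduces the page's numbers from that factor; real emissions vary with aircraft type, load factor, and routing.

```python
KG_PER_LB = 0.45359237

def co2_kg(distance_miles, kg_per_passenger_mile=111 / 589.484):
    """CO2 per passenger from jet-fuel burn, using the per-mile factor
    implied by this page's own figures (~0.188 kg per passenger-mile)."""
    return distance_miles * kg_per_passenger_mile

kg = co2_kg(589.484)
print(f"{kg:.0f} kg = {kg / KG_PER_LB:.0f} lb")  # 111 kg = 245 lb
```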
Map of flight path and driving directions from Nakina to Brandon
See the map of the shortest flight path between Nakina Airport (YQN) and Brandon Municipal Airport (YBR).
Airport information
| Origin | Nakina Airport |
| --- | --- |
| City: | Nakina |
| Country: | Canada |
| IATA Code: | YQN |
| ICAO Code: | CYQN |
| Coordinates: | 50°10′58″N, 86°41′47″W |
| Destination | Brandon Municipal Airport |
| --- | --- |
| City: | Brandon |
| Country: | Canada |
| IATA Code: | YBR |
| ICAO Code: | CYBR |
| Coordinates: | 49°54′36″N, 99°57′6″W |
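The coordinates above are in degrees, minutes, and seconds. As an illustration, a small helper converting them to the signed decimal degrees used by the distance formulas earlier:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 50°10′58″N to signed decimal degrees.
    Assumes well-formed input in the exact format used above."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(dms_to_decimal("50°10′58″N"), dms_to_decimal("86°41′47″W"))  # ~50.1828 -86.6964
```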