
How far is Brandon from Pickle Lake?

The distance between Pickle Lake (Pickle Lake Airport) and Brandon (Brandon Municipal Airport) is 440 miles / 709 kilometers / 383 nautical miles.

The driving distance from Pickle Lake (YPL) to Brandon (YBR) is 577 miles / 928 kilometers, and travel time by car is about 13 hours 48 minutes.

Pickle Lake Airport – Brandon Municipal Airport

440 miles / 709 kilometers / 383 nautical miles


Distance from Pickle Lake to Brandon

There are several ways to calculate the distance from Pickle Lake to Brandon. Here are two standard methods:

Vincenty's formula (applied above)
  • 440.299 miles
  • 708.593 kilometers
  • 382.609 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
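As a minimal sketch, the ellipsoidal figure can be reproduced in Python with the geopy library (assuming it is installed; geopy's geodesic routine uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's original iteration, but the two agree to well under a metre on a route like this). The decimal coordinates are converted from the DMS values listed under Airport information below.

    from geopy.distance import geodesic  # pip install geopy

    # Approximate decimal-degree coordinates (from the DMS values below)
    ypl = (51.4464, -90.2142)  # Pickle Lake Airport (YPL)
    ybr = (49.9100, -99.9517)  # Brandon Municipal Airport (YBR)

    d = geodesic(ypl, ybr)  # ellipsoidal (WGS-84) distance
    print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} nmi")
    # roughly 440.3 mi / 708.6 km / 382.6 nmi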

Haversine formula
  • 438.994 miles
  • 706.493 kilometers
  • 381.476 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
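For comparison, here is a self-contained haversine sketch (the Earth radius of 3,958.8 miles is the usual mean-radius assumption; the coordinates are the same decimal conversions used above):

    from math import radians, sin, cos, asin, sqrt

    def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
        """Great-circle distance on a sphere via the haversine formula."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * radius_miles * asin(sqrt(a))

    print(f"{haversine_miles(51.4464, -90.2142, 49.9100, -99.9517):.3f} mi")
    # ~438.99 mi, matching the haversine figure above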

How long does it take to fly from Pickle Lake to Brandon?

The estimated flight time from Pickle Lake Airport to Brandon Municipal Airport is 1 hour and 20 minutes.
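Estimates like this typically combine a cruise-speed term with a fixed allowance for taxi, takeoff, climb, and descent. Here is a sketch of one common heuristic; the 500 mph cruise speed and 30-minute overhead are assumptions for illustration, not the calculator's published method.

    def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
        """Rough flight time: cruise time plus a fixed taxi/climb/descent allowance."""
        return overhead_min + distance_miles / cruise_mph * 60

    print(round(estimate_flight_minutes(440)))  # ~83 minutes, close to the 1 h 20 min above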

Flight carbon footprint between Pickle Lake Airport (YPL) and Brandon Municipal Airport (YBR)

On average, flying from Pickle Lake to Brandon generates about 90 kg of CO2 per passenger (90 kilograms is equal to 198 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
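The unit conversion, along with the per-mile emission rate implied by the figure above (a derived value, not one the calculator states):

    KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

    co2_kg = 90
    print(f"{co2_kg / KG_PER_LB:.0f} lb")     # ~198 lb
    print(f"{co2_kg / 440:.2f} kg per mile")  # ~0.20 kg CO2 per passenger-mile (implied)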

Map of flight path and driving directions from Pickle Lake to Brandon

See the map of the shortest flight path between Pickle Lake Airport (YPL) and Brandon Municipal Airport (YBR).

Airport information

Origin: Pickle Lake Airport
City: Pickle Lake
Country: Canada
IATA Code: YPL
ICAO Code: CYPL
Coordinates: 51°26′47″N, 90°12′51″W
Destination: Brandon Municipal Airport
City: Brandon
Country: Canada
IATA Code: YBR
ICAO Code: CYBR
Coordinates: 49°54′36″N, 99°57′6″W
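The decimal coordinates used in the distance sketches above come from these DMS values; a small conversion helper:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to signed decimal degrees."""
        value = deg + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    # YPL: 51°26′47″N, 90°12′51″W
    print(dms_to_decimal(51, 26, 47, "N"), dms_to_decimal(90, 12, 51, "W"))
    # 51.446389 -90.214167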