How far is Brandon from Everett, WA?
The air distance between Everett (Paine Field) and Brandon (Brandon Municipal Airport) is 1023 miles / 1646 kilometers / 889 nautical miles.
The driving distance from Everett (PAE) to Brandon (YBR) is 1282 miles / 2063 kilometers, and travel time by car is about 24 hours 59 minutes.
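All three air-distance figures express the same measurement in different units. A quick sketch of the conversions (both factors are exact by definition):

```python
MI_TO_KM = 1.609344   # international mile, exact by definition
M_PER_NMI = 1852      # nautical mile, exact by definition

air_mi = 1023
air_km = air_mi * MI_TO_KM
print(f"{air_km:.0f} km")                      # 1646 km
print(f"{air_km * 1000 / M_PER_NMI:.0f} nmi")  # 889 nmi
```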
Distance from Everett to Brandon
There are several ways to calculate the distance from Everett to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 1022.628 miles
- 1645.760 kilometers
- 888.639 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 1019.596 miles
- 1640.881 kilometers
- 886.005 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface of the sphere.
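To make the comparison concrete, here is a minimal Python sketch of the haversine calculation, using the airport coordinates from the tables below (converted to decimal degrees). It reproduces the spherical figures above to within rounding:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere via the haversine formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Airport coordinates from the tables below (west longitudes negative).
PAE = (47.906111, -122.281944)  # Paine Field
YBR = (49.910000, -99.951667)   # Brandon Municipal Airport

km = haversine_km(*PAE, *YBR)
print(f"{km:.3f} km = {km / 1.609344:.3f} mi = {km / 1.852:.3f} nmi")
# ≈ 1640.881 km / 1019.596 mi / 886.005 nmi, matching the figures above.
```

For the ellipsoidal figure, a library such as geographiclib (`Geodesic.WGS84.Inverse(lat1, lon1, lat2, lon2)["s12"]`, in metres) solves the same inverse geodesic problem Vincenty's formula addresses, using Karney's algorithm, and should agree with the Vincenty result above to well under a metre.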
How long does it take to fly from Everett to Brandon?
The estimated flight time from Paine Field to Brandon Municipal Airport is 2 hours and 26 minutes.
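The page does not state how this estimate is derived. One plausible back-of-the-envelope model, sketched below, divides the air distance by an assumed average block speed; the ~420 mph figure is an illustrative assumption chosen here, not the site's published method:

```python
# Hypothetical model: air distance over an assumed average block speed.
# 420 mph is an assumption (an average that absorbs slower climb and
# descent segments), not the page's documented formula.
distance_mi = 1023
block_speed_mph = 420
hours = distance_mi / block_speed_mph
h, m = int(hours), round(hours % 1 * 60)
print(f"about {h} h {m} min")   # about 2 h 26 min
```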
What is the time difference between Everett and Brandon?
The time difference between Everett and Brandon is 2 hours: Brandon is 2 hours ahead of Everett (Brandon observes Central Time, Everett Pacific Time).
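A quick way to check this programmatically is with Python's zoneinfo module, using the standard IANA zone identifiers for the two cities (America/Los_Angeles for Everett, America/Winnipeg for Brandon, Manitoba):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Compare the two cities' current UTC offsets.
now = datetime.now(ZoneInfo("America/Los_Angeles"))
offset = (now.astimezone(ZoneInfo("America/Winnipeg")).utcoffset()
          - now.utcoffset())
print(offset)  # 2:00:00 -> Brandon is 2 hours ahead
```

Both zones switch to and from daylight saving time on the same dates, so the 2-hour gap holds year-round.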
Flight carbon footprint between Paine Field (PAE) and Brandon Municipal Airport (YBR)
On average, flying from Everett to Brandon generates about 152 kg of CO2 per passenger; 152 kilograms is roughly 335 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
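As a sanity check on the arithmetic, the sketch below verifies the kilograms-to-pounds conversion and derives the per-mile intensity implied by the page's own figures (this is a back-calculated value, not an official emission factor):

```python
co2_kg = 152    # page's per-passenger estimate
air_mi = 1023   # air distance from above

print(f"{co2_kg * 2.20462:.0f} lbs")                        # ~335 lbs
print(f"{co2_kg / air_mi:.3f} kg CO2 per passenger-mile")   # ~0.149
```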
Map of flight path and driving directions from Everett to Brandon
See the map of the shortest flight path between Paine Field (PAE) and Brandon Municipal Airport (YBR).
Airport information
| Origin | Paine Field |
| --- | --- |
| City | Everett, WA |
| Country | United States |
| IATA Code | PAE |
| ICAO Code | KPAE |
| Coordinates | 47°54′22″N, 122°16′55″W |
| Destination | Brandon Municipal Airport |
| --- | --- |
| City | Brandon |
| Country | Canada |
| IATA Code | YBR |
| ICAO Code | CYBR |
| Coordinates | 49°54′36″N, 99°57′6″W |
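The coordinates above are given in degrees/minutes/seconds. A small helper (hypothetical, for illustration) converts them to the signed decimal degrees used in the haversine sketch earlier:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(47, 54, 22, "N"))   # ≈ 47.9061 (PAE latitude)
print(dms_to_decimal(122, 16, 55, "W"))  # ≈ -122.2819 (PAE longitude)
```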