How far is Burlington, IA, from Everett, WA?
The distance between Everett (Paine Field) and Burlington (Southeast Iowa Regional Airport) is 1608 miles / 2587 kilometers / 1397 nautical miles.
The driving distance from Everett (PAE) to Burlington (BRL) is 1944 miles / 3129 kilometers, and travel time by car is about 34 hours 36 minutes.
Paine Field – Southeast Iowa Regional Airport
Distance from Everett to Burlington
There are several ways to calculate the distance from Everett to Burlington. Here are two standard methods:
Vincenty's formula (applied above)
- 1607.545 miles
- 2587.093 kilometers
- 1396.919 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
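For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates from the table further down (converted from degrees-minutes-seconds to decimal degrees). It is an illustration of the method, not the calculator's exact code, so the last decimal places may differ slightly.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a, f = 6378137.0, 1 / 298.257223563      # WGS-84 semi-major axis (m) and flattening
    b = (1 - f) * a
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L

    for _ in range(200):                     # iterate until lambda converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344                 # meters -> statute miles

# PAE: 47°54′22″N, 122°16′55″W  ->  (47.9061, -122.2819)
# BRL: 40°46′59″N,  91°7′31″W   ->  (40.7831,  -91.1253)
print(round(vincenty_miles(47.9061, -122.2819, 40.7831, -91.1253), 1))
# ≈ 1607.5 miles, close to the Vincenty figure above
```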
Haversine formula
- 1603.645 miles
- 2580.817 kilometers
- 1393.529 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
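The spherical calculation is much shorter. The sketch below assumes a mean Earth radius of 3,958.8 miles (an assumption; the calculator's exact radius is not stated) and uses the same airport coordinates as above; it lands within about a mile of the haversine figure quoted here.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_mi * asin(sqrt(a))

print(round(haversine_miles(47.9061, -122.2819, 40.7831, -91.1253), 1))
# ≈ 1603–1604 miles, close to the haversine figure above
```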
How long does it take to fly from Everett to Burlington?
The estimated flight time from Paine Field to Southeast Iowa Regional Airport is 3 hours and 32 minutes.
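The page does not state how this flight time is derived. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to cruise time at an assumed average speed; the 30-minute allowance and 530 mph speed in the sketch below are illustrative assumptions chosen to land near the 3 hours 32 minutes quoted above, not the calculator's documented parameters.

```python
def estimated_flight_time(distance_miles, cruise_mph=530, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.

    The 530 mph and 30 min defaults are illustrative assumptions, not the
    calculator's documented inputs.
    """
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(1608))  # ~3 h 32 min with these assumed constants
```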
What is the time difference between Everett and Burlington?
The time difference between Everett and Burlington is 2 hours. Everett is on Pacific Time and Burlington is on Central Time, so Burlington is 2 hours ahead of Everett.
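To check the offset programmatically, the IANA time zone database can be used. The sketch below assumes the usual zone mapping of Everett to America/Los_Angeles and Burlington, IA to America/Chicago.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Assumed IANA zones: Everett -> US Pacific, Burlington, IA -> US Central.
everett = ZoneInfo("America/Los_Angeles")
burlington = ZoneInfo("America/Chicago")

now = datetime.now(tz=everett)
offset = now.astimezone(burlington).utcoffset() - now.utcoffset()
print(offset)  # 2:00:00 -> Burlington is 2 hours ahead of Everett
```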
Flight carbon footprint between Paine Field (PAE) and Southeast Iowa Regional Airport (BRL)
On average, flying from Everett to Burlington generates about 186 kg of CO2 per passenger, which is roughly 411 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
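The page gives only the per-passenger total. One common back-of-the-envelope approach multiplies an assumed per-passenger fuel burn per mile by the roughly 3.16 kg of CO2 released per kilogram of jet fuel burned. In the sketch below the fuel-burn rate is an assumed round number tuned to reproduce a figure near the 186 kg quoted above, not a published value for any specific aircraft or the calculator's actual methodology.

```python
def co2_per_passenger_kg(distance_miles,
                         fuel_kg_per_passenger_mile=0.0366,
                         co2_kg_per_fuel_kg=3.16):
    """Back-of-the-envelope CO2 estimate per passenger.

    3.16 kg CO2 per kg of jet fuel is the standard combustion factor; the
    per-passenger fuel burn rate is an assumed value chosen only to land
    in the ballpark of the ~186 kg quoted above.
    """
    return distance_miles * fuel_kg_per_passenger_mile * co2_kg_per_fuel_kg

print(round(co2_per_passenger_kg(1608)))  # ≈ 186 kg with these assumptions
```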
Map of flight path and driving directions from Everett to Burlington
See the map of the shortest flight path between Paine Field (PAE) and Southeast Iowa Regional Airport (BRL).
Airport information
| Origin | Paine Field |
| --- | --- |
| City | Everett, WA |
| Country | United States |
| IATA Code | PAE |
| ICAO Code | KPAE |
| Coordinates | 47°54′22″N, 122°16′55″W |
| Destination | Southeast Iowa Regional Airport |
| --- | --- |
| City | Burlington, IA |
| Country | United States |
| IATA Code | BRL |
| ICAO Code | KBRL |
| Coordinates | 40°46′59″N, 91°7′31″W |