How far is Arctic Bay from Seattle, WA?
The distance between Seattle (Seattle–Tacoma International Airport) and Arctic Bay (Arctic Bay Airport) is 2111 miles / 3397 kilometers / 1834 nautical miles.
Seattle–Tacoma International Airport – Arctic Bay Airport
Distance from Seattle to Arctic Bay
There are several ways to calculate the distance from Seattle to Arctic Bay. Here are two standard methods:
Vincenty's formula (applied above)
- 2111.022 miles
- 3397.361 kilometers
- 1834.428 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
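As a rough cross-check, the same route can be computed with the geopy library. Its geodesic() function uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's formula itself, but both rely on an ellipsoidal earth model and agree to well under a metre over a route of this length. The coordinates below are the airport coordinates listed further down the page, converted to decimal degrees.

```python
# Sketch of an ellipsoidal distance calculation with geopy (not this page's
# own code). geodesic() uses Karney's algorithm on the WGS-84 ellipsoid,
# which matches Vincenty-style results to well under a metre here.
from geopy.distance import geodesic

sea = (47.4489, -122.3089)   # SEA: 47°26′56″N, 122°18′32″W
yab = (73.0056, -85.0425)    # YAB: 73°0′20″N, 85°2′33″W

d = geodesic(sea, yab)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
# Expect figures close to those above: ≈ 2111 mi / 3397 km / 1834 NM
```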
Haversine formula
- 2105.988 miles
- 3389.259 kilometers
- 1830.054 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, i.e. the shortest path between two points along the sphere's surface).
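The haversine calculation is simple enough to reproduce directly. The sketch below assumes a mean earth radius of 6371 km and uses the airport coordinates from the tables further down, converted to decimal degrees.

```python
# Self-contained haversine (great-circle) distance, assuming a mean
# earth radius of 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in statute miles."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a)) / 1.609344  # km -> miles

print(round(haversine_miles(47.4489, -122.3089, 73.0056, -85.0425), 3))
# ≈ 2106 miles, in line with the haversine figure above
```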
How long does it take to fly from Seattle to Arctic Bay?
The estimated flight time from Seattle–Tacoma International Airport to Arctic Bay Airport is 4 hours and 29 minutes.
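The page does not state how this estimate is derived, but a common rule of thumb is distance divided by an average cruise speed, plus a fixed allowance for taxi, climb and descent. The cruise speed and buffer below are illustrative assumptions chosen to show the method; they roughly reproduce the figure above but are not the site's actual parameters.

```python
# Rough flight-time estimate: assumed 530 mph average speed and a 30-minute
# taxi/climb/descent buffer (both assumptions, not the site's parameters).
distance_miles = 2111.022
cruise_mph = 530
buffer_min = 30

total_min = buffer_min + distance_miles / cruise_mph * 60
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # ≈ 4 h 29 min
```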
What is the time difference between Seattle and Arctic Bay?
Flight carbon footprint between Seattle–Tacoma International Airport (SEA) and Arctic Bay Airport (YAB)
On average, flying from Seattle to Arctic Bay generates about 230 kg (507 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
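For reference, the unit conversion and the implied per-mile figure can be checked directly; the per-mile value below is simply the stated total divided by the Vincenty distance, not the page's underlying emissions model.

```python
# Check the kg -> lbs conversion and the implied per-passenger-mile figure.
co2_kg = 230
co2_lbs = co2_kg * 2.20462              # ≈ 507 lbs, as stated above
co2_per_mile = co2_kg / 2111.022        # ≈ 0.109 kg CO2 per passenger-mile
print(f"{co2_lbs:.0f} lbs, {co2_per_mile:.3f} kg/mile")
```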
Map of flight path from Seattle to Arctic Bay
See the map of the shortest flight path between Seattle–Tacoma International Airport (SEA) and Arctic Bay Airport (YAB).
Airport information
| Origin | Seattle–Tacoma International Airport |
| --- | --- |
| City | Seattle, WA |
| Country | United States |
| IATA Code | SEA |
| ICAO Code | KSEA |
| Coordinates | 47°26′56″N, 122°18′32″W |
| Destination | Arctic Bay Airport |
| --- | --- |
| City | Arctic Bay |
| Country | Canada |
| IATA Code | YAB |
| ICAO Code | CYAB |
| Coordinates | 73°0′20″N, 85°2′33″W |
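The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier on the page work in decimal degrees. A minimal conversion sketch (the helper name is illustrative):

```python
# Convert DMS coordinates from the tables above to decimal degrees.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

sea = (dms_to_decimal(47, 26, 56, "N"), dms_to_decimal(122, 18, 32, "W"))
yab = (dms_to_decimal(73, 0, 20, "N"), dms_to_decimal(85, 2, 33, "W"))
print(sea)  # ≈ (47.4489, -122.3089)
print(yab)  # ≈ (73.0056, -85.0425)
```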