How far is Terrace from Arctic Bay?
The distance between Arctic Bay (Arctic Bay Airport) and Terrace (Northwest Regional Airport Terrace-Kitimat) is 1778 miles / 2862 kilometers / 1545 nautical miles.
Arctic Bay Airport – Northwest Regional Airport Terrace-Kitimat
Distance from Arctic Bay to Terrace
There are several ways to calculate the distance from Arctic Bay to Terrace. Here are two standard methods:
Vincenty's formula (applied above)
- 1778.184 miles
- 2861.710 kilometers
- 1545.200 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
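As a concrete illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid (the standard choice, though the exact parameters used above are an assumption). The coordinates are the airport positions from the table at the bottom of the page, converted to decimal degrees.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometres."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0   # metres -> kilometres

# YAB and YXT positions from the airport table below, in decimal degrees
print(vincenty_km(73.00556, -85.0425, 54.46833, -128.57583))  # ≈ 2861.7 km
```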
Haversine formula
- 1772.733 miles
- 2852.937 kilometers
- 1540.463 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance, the shortest path between two points along the surface.
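For comparison, a compact haversine sketch, assuming the commonly used mean Earth radius of 6,371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# Same YAB -> YXT coordinates as above
print(haversine_km(73.00556, -85.0425, 54.46833, -128.57583))  # ≈ 2853 km
```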
How long does it take to fly from Arctic Bay to Terrace?
The estimated flight time from Arctic Bay Airport to Northwest Regional Airport Terrace-Kitimat is 3 hours and 52 minutes.
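The quoted time is consistent with a common rule of thumb: a fixed 30 minutes for taxi, climb, and descent, plus cruise at roughly 850 km/h. That rule and the cruise speed are assumptions, not the site's documented method, but the arithmetic reproduces the figure:

```python
# Assumed rule of thumb: 30 min overhead + cruise at ~850 km/h (not a documented method)
distance_km = 2861.710
total_min = 30 + distance_km / 850.0 * 60
hours, minutes = divmod(round(total_min), 60)
print(f"{hours} hours and {minutes} minutes")  # 3 hours and 52 minutes
```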
What is the time difference between Arctic Bay and Terrace?
The time difference between Arctic Bay and Terrace is 2 hours: Terrace is 2 hours behind Arctic Bay.
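One way to verify the gap, assuming standard-time UTC offsets of −6 (Central) for Arctic Bay and −8 (Pacific) for Terrace; both regions observe daylight saving in step, so the 2-hour gap holds year-round:

```python
from datetime import datetime, timedelta, timezone

# Assumed standard offsets: Arctic Bay UTC-6 (Central), Terrace UTC-8 (Pacific)
arctic_bay = timezone(timedelta(hours=-6), "Arctic Bay")
terrace = timezone(timedelta(hours=-8), "Terrace")
now = datetime.now(timezone.utc)
gap = now.astimezone(arctic_bay).utcoffset() - now.astimezone(terrace).utcoffset()
print(gap)  # 2:00:00 -> Terrace is 2 hours behind Arctic Bay
```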
Flight carbon footprint between Arctic Bay Airport (YAB) and Northwest Regional Airport Terrace-Kitimat (YXT)
On average, flying from Arctic Bay to Terrace generates about 198 kg (437 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
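The per-passenger figure implies an emission factor of roughly 0.069 kg of CO2 per kilometre flown. That factor is inferred from the numbers above rather than documented, but it ties the distance and footprint together:

```python
distance_km = 2861.710
kg_co2_per_km = 0.0692  # inferred from the figures above, not a documented factor
co2_kg = distance_km * kg_co2_per_km
print(round(co2_kg))             # ≈ 198 kg
print(round(co2_kg * 2.20462))   # ≈ 437 lb
```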
Map of flight path from Arctic Bay to Terrace
See the map of the shortest flight path between Arctic Bay Airport (YAB) and Northwest Regional Airport Terrace-Kitimat (YXT).
Airport information
| Origin | Arctic Bay Airport |
| --- | --- |
| City | Arctic Bay |
| Country | Canada |
| IATA Code | YAB |
| ICAO Code | CYAB |
| Coordinates | 73°0′20″N, 85°2′33″W |
| Destination | Northwest Regional Airport Terrace-Kitimat |
| --- | --- |
| City | Terrace |
| Country | Canada |
| IATA Code | YXT |
| ICAO Code | CYXT |
| Coordinates | 54°28′6″N, 128°34′33″W |