How far is Terrace from North Spirit Lake?
The distance between North Spirit Lake (North Spirit Lake Airport) and Terrace (Northwest Regional Airport Terrace-Kitimat) is 1459 miles / 2349 kilometers / 1268 nautical miles.
North Spirit Lake Airport – Northwest Regional Airport Terrace-Kitimat
Distance from North Spirit Lake to Terrace
There are several ways to calculate the distance from North Spirit Lake to Terrace. Here are two standard methods:
Vincenty's formula (applied above)
- 1459.366 miles
- 2348.622 kilometers
- 1268.154 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
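As an illustration (not the site's own code), the sketch below implements the standard Vincenty inverse iteration on the WGS-84 ellipsoid in Python. The decimal coordinates are converted from the airport table further down this page, and the result should land close to the 2,348.6 km figure quoted above.

```python
import math

def vincenty_inverse_m(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in metres between two points on the WGS-84 ellipsoid
    (standard Vincenty inverse iteration; it can fail to converge for
    near-antipodal points, which is not a concern on this route)."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                       if cos_sq_alpha else 0.0)
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sigma_m ** 2)
        - B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YNO and YXT coordinates from the airport table below, in decimal degrees.
d_km = vincenty_inverse_m(52.49, -92.970833, 54.468333, -128.575833) / 1000
print(f"{d_km:.1f} km")  # should come out close to the 2,348.6 km quoted above
```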
Haversine formula
- 1454.612 miles
- 2340.971 kilometers
- 1264.023 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
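The haversine figure above can be reproduced with a few lines of Python. The mean Earth radius of 6,371 km is this sketch's assumption; with it, the result lands within about a kilometre of the figures quoted above, and the exact value depends on the radius used.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YNO and YXT coordinates from the airport table below, in decimal degrees.
d = haversine_km(52.49, -92.970833, 54.468333, -128.575833)
print(f"{d:.1f} km / {d * 0.621371:.1f} mi / {d * 0.539957:.1f} NM")
```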
How long does it take to fly from North Spirit Lake to Terrace?
The estimated flight time from North Spirit Lake Airport to Northwest Regional Airport Terrace-Kitimat is 3 hours and 15 minutes.
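The page does not state how this estimate is derived; a common rule of thumb is the great-circle distance divided by an average cruise speed, plus a fixed allowance for taxi, climb and descent. The sketch below uses assumed values (500 mph cruise and a 30-minute allowance), so it gives a ballpark figure in the same range as, but not identical to, the estimate above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate as (hours, minutes); parameters are assumptions."""
    total = round(distance_miles / cruise_mph * 60 + overhead_min)  # total minutes
    return total // 60, total % 60

hours, minutes = estimate_flight_time(1459.366)
print(f"Estimated flight time: {hours} h {minutes} min")
```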
What is the time difference between North Spirit Lake and Terrace?
The time difference between North Spirit Lake and Terrace is 2 hours. Terrace is 2 hours behind North Spirit Lake.
Flight carbon footprint between North Spirit Lake Airport (YNO) and Northwest Regional Airport Terrace-Kitimat (YXT)
On average, flying from North Spirit Lake to Terrace generates about 177 kg of CO2 per passenger, which is equal to roughly 390 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
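Using only the figures quoted above, the pound conversion and the implied per-kilometre intensity work out as follows (the per-kilometre figure is derived here, not stated on the page):

```python
CO2_PER_PASSENGER_KG = 177.0   # per-passenger estimate quoted above
DISTANCE_KM = 2348.622         # Vincenty distance quoted above
KG_TO_LB = 2.20462             # pounds per kilogram

print(f"{CO2_PER_PASSENGER_KG:.0f} kg = {CO2_PER_PASSENGER_KG * KG_TO_LB:.0f} lb")
print(f"About {CO2_PER_PASSENGER_KG / DISTANCE_KM * 1000:.0f} g CO2 per passenger-km")
```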
Map of flight path from North Spirit Lake to Terrace
See the map of the shortest flight path between North Spirit Lake Airport (YNO) and Northwest Regional Airport Terrace-Kitimat (YXT).
Airport information
| Origin | North Spirit Lake Airport |
| --- | --- |
| City: | North Spirit Lake |
| Country: | Canada |
| IATA Code: | YNO |
| ICAO Code: | CKQ3 |
| Coordinates: | 52°29′24″N, 92°58′15″W |
| Destination | Northwest Regional Airport Terrace-Kitimat |
| --- | --- |
| City: | Terrace |
| Country: | Canada |
| IATA Code: | YXT |
| ICAO Code: | CYXT |
| Coordinates: | 54°28′6″N, 128°34′33″W |
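The coordinates above are given in degrees, minutes and seconds. The sketch below shows one way to convert them to the signed decimal degrees used by the distance functions sketched earlier; the helper name is illustrative.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

yno = (dms_to_decimal(52, 29, 24, "N"), dms_to_decimal(92, 58, 15, "W"))
yxt = (dms_to_decimal(54, 28, 6, "N"), dms_to_decimal(128, 34, 33, "W"))
print(yno, yxt)  # approximately (52.49, -92.9708) and (54.4683, -128.5758)
```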