How far is Terrace from Gods Lake Narrows?
The distance between Gods Lake Narrows (Gods Lake Narrows Airport) and Terrace (Northwest Regional Airport Terrace-Kitimat) is 1358 miles / 2186 kilometers / 1180 nautical miles.
Distance from Gods Lake Narrows to Terrace
There are several ways to calculate the distance from Gods Lake Narrows to Terrace. Here are two standard methods:
Vincenty's formula (applied above)
- 1358.169 miles
- 2185.762 kilometers
- 1180.217 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
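As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is an assumption about the method, not necessarily the exact code behind the figure above; the coordinates are converted from the DMS values in the airport table below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0           # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563   # WGS-84 flattening
    b = (1 - f) * a         # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate the longitude on the auxiliary sphere to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YGO and YXT coordinates from the airport table below (decimal degrees)
d = vincenty_distance(54.5589, -94.4914, 54.4683, -128.5758)
print(f"{d / 1609.344:.3f} mi / {d / 1000:.3f} km")  # ≈ 1358 mi / 2186 km
```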
Haversine formula
- 1353.638 miles
- 2178.469 kilometers
- 1176.279 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
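For comparison, a minimal haversine sketch in Python. A mean Earth radius of 6371 km is assumed here; the exact radius behind the figure above is not stated.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius; returns km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

print(haversine_distance(54.5589, -94.4914, 54.4683, -128.5758))  # ≈ 2178 km
```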
How long does it take to fly from Gods Lake Narrows to Terrace?
The estimated flight time from Gods Lake Narrows Airport to Northwest Regional Airport Terrace-Kitimat is 3 hours and 4 minutes.
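A common rule of thumb for such estimates (an assumption here, not necessarily the method used for the figure above) is cruise distance divided by an average airliner speed, plus a fixed allowance for takeoff and landing:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate; both parameter defaults are assumptions."""
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = estimate_flight_minutes(1358.169)
print(f"{int(minutes // 60)} h {int(minutes % 60)} min")
# prints "3 h 12 min" under these assumptions; the 3 h 4 min quoted above
# implies a slightly faster assumed cruise speed
```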
What is the time difference between Gods Lake Narrows and Terrace?
The time difference between Gods Lake Narrows and Terrace is 2 hours. Terrace is 2 hours behind Gods Lake Narrows: Gods Lake Narrows is on Central Time (Manitoba), while Terrace is on Pacific Time (British Columbia).
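This can be checked with Python's standard zoneinfo database, using the usual IANA zone names for Manitoba and coastal British Columbia:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Gods Lake Narrows, MB -> America/Winnipeg; Terrace, BC -> America/Vancouver
now = datetime.now(ZoneInfo("UTC"))
diff = (now.astimezone(ZoneInfo("America/Winnipeg")).utcoffset()
        - now.astimezone(ZoneInfo("America/Vancouver")).utcoffset())
print(diff)  # 2:00:00 -- Terrace is two hours behind
```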
Flight carbon footprint between Gods Lake Narrows Airport (YGO) and Northwest Regional Airport Terrace-Kitimat (YXT)
On average, flying from Gods Lake Narrows to Terrace generates about 171 kg of CO2 per passenger, which is equivalent to 376 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
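The kilograms-to-pounds conversion is simple arithmetic. Note that converting the rounded 171 kg directly gives about 377 lb, so the 376 lb above is presumably rounded from an unrounded underlying estimate:

```python
LB_PER_KG = 2.20462262  # pounds per kilogram

print(round(171 * LB_PER_KG))  # 377; 376 likely comes from the unrounded kg value
```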
Map of flight path from Gods Lake Narrows to Terrace
See the map of the shortest flight path between Gods Lake Narrows Airport (YGO) and Northwest Regional Airport Terrace-Kitimat (YXT).
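To plot such a path yourself, intermediate waypoints along the great circle can be generated with the standard spherical interpolation formula. This is a sketch assuming a spherical Earth, not the code behind the map itself:

```python
import math

def great_circle_points(lat1, lon1, lat2, lon2, n=50):
    """Yield n+1 (lat, lon) waypoints along the great circle between two points."""
    phi1, lam1 = math.radians(lat1), math.radians(lon1)
    phi2, lam2 = math.radians(lat2), math.radians(lon2)
    # Angular distance between the endpoints (haversine form)
    d = 2 * math.asin(math.sqrt(
        math.sin((phi2 - phi1) / 2) ** 2
        + math.cos(phi1) * math.cos(phi2) * math.sin((lam2 - lam1) / 2) ** 2))
    for i in range(n + 1):
        f = i / n
        a = math.sin((1 - f) * d) / math.sin(d)
        b = math.sin(f * d) / math.sin(d)
        # Interpolate on the unit sphere in Cartesian coordinates
        x = a * math.cos(phi1) * math.cos(lam1) + b * math.cos(phi2) * math.cos(lam2)
        y = a * math.cos(phi1) * math.sin(lam1) + b * math.cos(phi2) * math.sin(lam2)
        z = a * math.sin(phi1) + b * math.sin(phi2)
        yield (math.degrees(math.atan2(z, math.hypot(x, y))),
               math.degrees(math.atan2(y, x)))

# Waypoints between YGO and YXT, ready to feed to a mapping library
waypoints = list(great_circle_points(54.5589, -94.4914, 54.4683, -128.5758))
```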
Airport information
| Origin | Gods Lake Narrows Airport |
| --- | --- |
| City: | Gods Lake Narrows |
| Country: | Canada |
| IATA Code: | YGO |
| ICAO Code: | CYGO |
| Coordinates: | 54°33′32″N, 94°29′29″W |
| Destination | Northwest Regional Airport Terrace-Kitimat |
| --- | --- |
| City: | Terrace |
| Country: | Canada |
| IATA Code: | YXT |
| ICAO Code: | CYXT |
| Coordinates: | 54°28′6″N, 128°34′33″W |