
How far is Terrace from Reykjavik?

The distance between Reykjavik (Keflavík International Airport) and Terrace (Northwest Regional Airport Terrace-Kitimat) is 3370 miles / 5424 kilometers / 2929 nautical miles.

Keflavík International Airport – Northwest Regional Airport Terrace-Kitimat

3370 miles / 5424 kilometers / 2929 nautical miles


Distance from Reykjavik to Terrace

There are several ways to calculate the distance from Reykjavik to Terrace. Here are two standard methods:

Vincenty's formula (applied above)
  • 3370.259 miles
  • 5423.906 kilometers
  • 2928.675 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
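As a rough illustration, the sketch below implements the standard Vincenty inverse formula on the WGS-84 ellipsoid in Python, with the airport coordinates listed further down the page converted to decimal degrees. The function name and tolerance values are my own choices, not the calculator's code, but the result should land close to the 3370-mile figure above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    # WGS-84 ellipsoid parameters
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                       if cos_sq_alpha else 0.0)
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos2sigma_m ** 2) -
            B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344       # metres to statute miles

# KEF (63°59′6″N, 22°36′20″W) to YXT (54°28′6″N, 128°34′33″W)
print(vincenty_miles(63.9850, -22.6056, 54.4683, -128.5758))  # ≈ 3370 miles
```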

Haversine formula
  • 3358.481 miles
  • 5404.952 kilometers
  • 2918.441 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
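For comparison, a minimal haversine sketch using a mean Earth radius of about 3,958.8 miles (6,371 km) reproduces the second set of figures; the function name and radius constant are illustrative assumptions rather than the calculator's own code.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    R = 3958.8  # mean Earth radius in statute miles (~6371 km)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Same KEF and YXT coordinates in decimal degrees
print(haversine_miles(63.9850, -22.6056, 54.4683, -128.5758))  # ≈ 3358 miles
```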

How long does it take to fly from Reykjavik to Terrace?

The estimated flight time from Keflavík International Airport to Northwest Regional Airport Terrace-Kitimat is 6 hours and 52 minutes.
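The page does not publish the exact speed model behind that figure. A common back-of-the-envelope estimate divides the distance by an assumed average block speed and adds a fixed allowance for taxi, climb, and descent; the constants below (500 mph, 30 minutes) are illustrative assumptions only, so the result differs somewhat from the 6 hours 52 minutes quoted above.

```python
CRUISE_MPH = 500          # assumed average speed, not the site's published value
GROUND_ALLOWANCE_H = 0.5  # assumed allowance for taxi, climb, and descent

distance_mi = 3370.259    # Vincenty distance from the section above
hours = distance_mi / CRUISE_MPH + GROUND_ALLOWANCE_H
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"Estimated flight time: {h} h {m} min")  # ≈ 7 h 14 min with these assumptions
```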

Flight carbon footprint between Keflavík International Airport (KEF) and Northwest Regional Airport Terrace-Kitimat (YXT)

On average, flying from Reykjavik to Terrace generates about 379 kg of CO2 per passenger, which is roughly 835 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
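The pound figure follows directly from the kilogram estimate (1 kg ≈ 2.20462 lb), and dividing by the Vincenty distance gives an implied per-passenger-mile figure. The snippet below simply reproduces that arithmetic from the numbers on this page.

```python
co2_kg = 379                    # estimated CO2 per passenger, from the page
co2_lb = co2_kg * 2.20462       # kilograms to pounds
per_mile = co2_kg / 3370.259    # implied kg CO2 per passenger-mile
print(f"{co2_lb:.1f} lb total, {per_mile:.3f} kg per mile")
# ≈ 835.6 lb (the page rounds to 835) and ≈ 0.112 kg CO2 per passenger-mile
```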

Map of flight path from Reykjavik to Terrace

See the map of the shortest flight path between Keflavík International Airport (KEF) and Northwest Regional Airport Terrace-Kitimat (YXT).

Airport information

Origin: Keflavík International Airport
City: Reykjavik
Country: Iceland
IATA Code: KEF
ICAO Code: BIKF
Coordinates: 63°59′6″N, 22°36′20″W
Destination: Northwest Regional Airport Terrace-Kitimat
City: Terrace
Country: Canada
IATA Code: YXT
ICAO Code: CYXT
Coordinates: 54°28′6″N, 128°34′33″W