How far is Terrace from Seoul?

The distance between Seoul (Seoul Gimpo International Airport) and Terrace (Northwest Regional Airport Terrace-Kitimat) is 4689 miles / 7546 kilometers / 4074 nautical miles.

Seoul Gimpo International Airport – Northwest Regional Airport Terrace-Kitimat

4689 miles
7546 kilometers
4074 nautical miles

Distance from Seoul to Terrace

There are several ways to calculate the distance from Seoul to Terrace. Here are two standard methods:

Vincenty's formula (applied above)
  • 4688.797 miles
  • 7545.887 kilometers
  • 4074.453 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
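For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse solution on the WGS-84 ellipsoid. It is illustrative, not the calculator's actual code; the coordinates are converted from the degree-minute-second values listed under Airport information below.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns miles."""
    a = 6378137.0               # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    # Longitude difference, normalized to [-180, 180) degrees
    L = math.radians((lon2 - lon1 + 540) % 360 - 180)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero on equatorial lines (cos2_alpha == 0)
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                          * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344    # international mile

# GMP (37°33′29″N, 126°47′27″E) to YXT (54°28′6″N, 128°34′33″W)
print(vincenty_miles(37.558056, 126.790833, 54.468333, -128.575833))
# expect roughly 4688.8 miles, matching the figure above
```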

Haversine formula
  • 4676.667 miles
  • 7526.366 kilometers
  • 4063.912 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
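For comparison, here is a compact Python sketch of the haversine formula. The mean earth radius of 6,371 km is an assumed value (the page does not state which radius it uses); with the coordinates from the Airport information section it lands very close to the figures above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# GMP to YXT; 1 mile = 1.609344 km, 1 nautical mile = 1.852 km
km = haversine_km(37.558056, 126.790833, 54.468333, -128.575833)
print(f"{km / 1.609344:.3f} mi, {km:.3f} km, {km / 1.852:.3f} nmi")
```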

How long does it take to fly from Seoul to Terrace?

The estimated flight time from Seoul Gimpo International Airport to Northwest Regional Airport Terrace-Kitimat is 9 hours and 22 minutes.
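The page does not state how the estimate is derived. A common back-of-the-envelope approach, and one consistent with the figure above, is to divide the distance by an assumed average block speed of about 500 mph; the sketch below uses that assumption.

```python
def flight_time(distance_miles, avg_speed_mph=500.0):
    """Rough flight-time estimate; avg_speed_mph is an assumed figure."""
    hours = distance_miles / avg_speed_mph
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours and {m} minutes"

print(flight_time(4689))  # within a minute of the estimate above
```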

Flight carbon footprint between Seoul Gimpo International Airport (GMP) and Northwest Regional Airport Terrace-Kitimat (YXT)

On average, flying from Seoul to Terrace generates about 544 kg (1,198 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
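The emission factor behind this estimate is not published on the page, but dividing its own numbers (544 kg over 7,546 km) implies roughly 0.072 kg of CO2 per passenger-kilometer. The sketch below back-derives the estimate that way; the factor is inferred from this page, not an official value. Note that the exact kilogram-to-pound conversion gives about 1,199 lbs, so the page's 1,198 presumably reflects an unrounded kilogram figure.

```python
KG_PER_LB = 0.45359237  # exact definition of the pound

def co2_estimate_kg(distance_km, kg_per_pax_km=544 / 7546):
    """CO2 per passenger; factor inferred from this page's own numbers."""
    return distance_km * kg_per_pax_km

kg = co2_estimate_kg(7546)
print(f"{kg:.0f} kg = {kg / KG_PER_LB:.0f} lbs")  # ~544 kg, ~1,199 lbs
```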

Map of flight path from Seoul to Terrace

See the map of the shortest flight path between Seoul Gimpo International Airport (GMP) and Northwest Regional Airport Terrace-Kitimat (YXT).

Airport information

Origin: Seoul Gimpo International Airport
City: Seoul
Country: South Korea
IATA Code: GMP
ICAO Code: RKSS
Coordinates: 37°33′29″N, 126°47′27″E

Destination: Northwest Regional Airport Terrace-Kitimat
City: Terrace
Country: Canada
IATA Code: YXT
ICAO Code: CYXT
Coordinates: 54°28′6″N, 128°34′33″W