How far is San Antonio, TX, from Terrace?

The distance between Terrace (Northwest Regional Airport Terrace-Kitimat) and San Antonio (San Antonio International Airport) is 2282 miles / 3672 kilometers / 1983 nautical miles.

The driving distance from Terrace (YXT) to San Antonio (SAT) is 2856 miles / 4596 kilometers, and travel time by car is about 55 hours 16 minutes.

Distance from Terrace to San Antonio

There are several ways to calculate the distance from Terrace to San Antonio. Here are two standard methods:

Vincenty's formula (applied above)
  • 2281.898 miles
  • 3672.359 kilometers
  • 1982.915 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
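For the curious, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid (the standard choice; the site doesn't say which ellipsoid it actually uses). The decimal coordinates in the usage lines are the airport coordinates from the table below, converted from degrees-minutes-seconds.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0            # semi-major axis (meters)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L

    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha is 0 only for equatorial lines; guard the division
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)  # distance in meters

# YXT: 54°28′6″N 128°34′33″W -> (54.46833, -128.57583)
# SAT: 29°32′1″N  98°28′11″W -> (29.53361,  -98.46972)
d = vincenty_distance(54.46833, -128.57583, 29.53361, -98.46972)
print(f"{d / 1000:.1f} km")  # ~3672.4 km, in line with the figure above
```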

Haversine formula
  • 2280.770 miles
  • 3670.544 kilometers
  • 1981.935 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
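A minimal sketch of the haversine formula; the Earth radius the site uses isn't stated, so the IUGG mean radius of 6371.0088 km is assumed here.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, r=6371008.8):
    # Great-circle distance on a sphere of mean radius r (meters)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

d = haversine_distance(54.46833, -128.57583, 29.53361, -98.46972)
print(f"{d / 1000:.1f} km")  # ~3670.5 km, close to the figure above
```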

How long does it take to fly from Terrace to San Antonio?

The estimated flight time from Northwest Regional Airport Terrace-Kitimat to San Antonio International Airport is 4 hours and 49 minutes.
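The page doesn't say how the flight time is derived. A common approach is simply distance divided by an assumed average block speed; a speed of about 474 mph reproduces the 4 hours 49 minutes quoted above, so the sketch below (illustrative only) uses that value.

```python
def estimated_flight_time(distance_miles, block_speed_mph=474.0):
    # block_speed_mph is an assumption back-calculated from this page's
    # own numbers (2282 mi in 4 h 49 min); the site's real method is unknown.
    hours = distance_miles / block_speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    if m == 60:           # normalize rounding at the hour boundary
        h, m = h + 1, 0
    return f"{h} h {m} min"

print(estimated_flight_time(2282))  # "4 h 49 min"
```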

Flight carbon footprint between Northwest Regional Airport Terrace-Kitimat (YXT) and San Antonio International Airport (SAT)

On average, flying from Terrace to San Antonio generates about 250 kg (551 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
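Two small derived numbers, computed from the figures above (the per-kilometre intensity is not stated by the site; it is simply 250 kg spread over the 3672 km great-circle distance):

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg, distance_km = 250, 3672
print(f"{co2_kg * KG_TO_LB:.0f} lb")                  # 551 lb
print(f"{co2_kg / distance_km * 1000:.0f} g CO2/km")  # ~68 g per passenger-km
```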

Map of flight path and driving directions from Terrace to San Antonio

See the map of the shortest flight path between Northwest Regional Airport Terrace-Kitimat (YXT) and San Antonio International Airport (SAT).

Airport information

Origin: Northwest Regional Airport Terrace-Kitimat
City: Terrace
Country: Canada
IATA Code: YXT
ICAO Code: CYXT
Coordinates: 54°28′6″N, 128°34′33″W
Destination: San Antonio International Airport
City: San Antonio, TX
Country: United States
IATA Code: SAT
ICAO Code: KSAT
Coordinates: 29°32′1″N, 98°28′11″W
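
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small illustrative helper for the conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    # Convert degrees/minutes/seconds to signed decimal degrees;
    # south and west hemispheres are negative.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

yxt = (dms_to_decimal(54, 28, 6, "N"), dms_to_decimal(128, 34, 33, "W"))
sat = (dms_to_decimal(29, 32, 1, "N"), dms_to_decimal(98, 28, 11, "W"))
print(yxt)  # ~(54.4683, -128.5758)
print(sat)  # ~(29.5336, -98.4697)
```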