How far is Terrace from Harbin?

The distance between Harbin (Harbin Taiping International Airport) and Terrace (Northwest Regional Airport Terrace-Kitimat) is 4270 miles / 6873 kilometers / 3711 nautical miles.


Distance from Harbin to Terrace

There are several ways to calculate the distance from Harbin to Terrace. Here are two standard methods:

Vincenty's formula (applied above)
  • 4270.468 miles
  • 6872.652 kilometers
  • 3710.935 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
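
The calculator's own code isn't published; the following is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed under "Airport information" below (converted to decimal degrees).

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles.
        Coincident and near-antipodal points need extra handling (omitted here)."""
        a = 6378137.0                 # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563         # WGS-84 flattening
        b = (1 - f) * a               # semi-minor axis (m)
        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)
        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos2_alpha is 0 only for equatorial lines; guard the division
            cos_2sm = 0.0 if cos2_alpha == 0 else (
                cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha)
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break
        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1609.344   # metres -> statute miles

    # HRB (45.62333, 126.25) to YXT (54.46833, -128.57583)
    print(round(vincenty_miles(45.62333, 126.25, 54.46833, -128.57583), 3))  # ≈ 4270.5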

Haversine formula
  • 4257.697 miles
  • 6852.099 kilometers
  • 3699.838 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
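
For comparison, a minimal sketch of the haversine formula, assuming the commonly used mean earth radius of 6,371 km:

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance on a sphere of mean radius 6371 km, in statute miles."""
        R_KM = 6371.0
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * R_KM * math.asin(math.sqrt(h)) / 1.609344   # km -> miles

    print(round(haversine_miles(45.62333, 126.25, 54.46833, -128.57583), 1))  # ≈ 4257.7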

How long does it take to fly from Harbin to Terrace?

The estimated flight time from Harbin Taiping International Airport to Northwest Regional Airport Terrace-Kitimat is 8 hours and 35 minutes.
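
The page doesn't state its timing model. A common rough approach, assumed here, is to divide the distance by an average block speed (cruise plus taxi, climb, and descent) of about 500 mph:

    def flight_time(distance_miles, block_speed_mph=500.0):
        """Rough flight-time estimate; the ~500 mph block speed is an assumption."""
        total_min = round(distance_miles / block_speed_mph * 60)
        h, m = divmod(total_min, 60)
        return f"{h} hours and {m} minutes"

    print(flight_time(4270.468))  # "8 hours and 32 minutes" -- close to the quoted 8 h 35 min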

Flight carbon footprint between Harbin Taiping International Airport (HRB) and Northwest Regional Airport Terrace-Kitimat (YXT)

On average, flying from Harbin to Terrace generates about 490 kg of CO2 per passenger; 490 kilograms is equal to about 1,080 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
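
The emission model isn't published either; back-solving from the figures above, 490 kg over roughly 4,270 miles works out to about 0.115 kg of CO2 per passenger-mile. A sketch using that assumed factor:

    KG_CO2_PER_PASSENGER_MILE = 0.1147   # assumed; back-solved from 490 kg / 4270 miles
    LBS_PER_KG = 2.20462

    def co2_per_passenger(distance_miles):
        """Estimated jet-fuel CO2 per passenger, in kg and lbs."""
        kg = distance_miles * KG_CO2_PER_PASSENGER_MILE
        return kg, kg * LBS_PER_KG

    kg, lbs = co2_per_passenger(4270.468)
    print(f"{kg:.0f} kg CO2 (~{lbs:.0f} lbs)")  # ≈ 490 kg (~1080 lbs)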

Map of flight path from Harbin to Terrace

See the map of the shortest flight path between Harbin Taiping International Airport (HRB) and Northwest Regional Airport Terrace-Kitimat (YXT).

Airport information

Origin: Harbin Taiping International Airport
City: Harbin
Country: China
IATA Code: HRB
ICAO Code: ZYHB
Coordinates: 45°37′24″N, 126°15′0″E
Destination: Northwest Regional Airport Terrace-Kitimat
City: Terrace
Country: Canada
IATA Code: YXT
ICAO Code: CYXT
Coordinates: 54°28′6″N, 128°34′33″W
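
The coordinates above are given in degrees, minutes, and seconds; the distance formulas earlier on this page want signed decimal degrees. A small conversion sketch (the helper name is illustrative, not part of the page):

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    # HRB: 45°37′24″N, 126°15′0″E  ->  45.62333, 126.25
    print(dms_to_decimal(45, 37, 24, "N"), dms_to_decimal(126, 15, 0, "E"))
    # YXT: 54°28′6″N, 128°34′33″W  ->  54.46833, -128.57583
    print(dms_to_decimal(54, 28, 6, "N"), dms_to_decimal(128, 34, 33, "W"))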