
How far is Harbin from Samjiyon?

The distance between Samjiyon (Samjiyon Airport) and Harbin (Harbin Taiping International Airport) is 278 miles / 448 kilometers / 242 nautical miles.

The driving distance from Samjiyon (YJS) to Harbin (HRB) is 380 miles / 611 kilometers, and travel time by car is about 7 hours 59 minutes.


Distance from Samjiyon to Harbin

There are several ways to calculate the distance from Samjiyon to Harbin. Here are two standard methods:

Vincenty's formula (applied above)
  • 278.369 miles
  • 447.991 kilometers
  • 241.896 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
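
To reproduce the ellipsoidal figure yourself, one option is the third-party geopy library (an assumption; this site does not document which implementation it uses). geopy's geodesic() implements Karney's algorithm on the WGS-84 ellipsoid, which agrees with Vincenty's formula to well under a metre at this range. The coordinates below come from the Airport information section, converted to decimal degrees.

```python
# Ellipsoidal (WGS-84) distance between YJS and HRB using geopy.
# geopy is an assumption here; the site does not name its library.
# geodesic() uses Karney's algorithm, which agrees with Vincenty's
# formula to well under a metre over a distance like this one.
from geopy.distance import geodesic

yjs = (41.906944, 128.409722)  # Samjiyon Airport, 41°54′25″N 128°24′35″E
hrb = (45.623333, 126.250000)  # Harbin Taiping Intl, 45°37′24″N 126°15′0″E

d = geodesic(yjs, hrb)
print(f"{d.miles:.3f} mi, {d.km:.3f} km, {d.nautical:.3f} NM")
# Expect roughly 278.4 mi / 448.0 km / 241.9 NM, as reported above.
```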

Haversine formula
  • 278.441 miles
  • 448.107 kilometers
  • 241.959 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface of the sphere.
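
The spherical figure is easy to reproduce from scratch. Here is a minimal, self-contained sketch in Python, assuming the conventional mean Earth radius of 6,371 km (the exact radius this site uses is not stated, so the last decimal place may differ).

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometers.

    Assumes a spherical Earth; 6371.0 km is the conventional mean
    radius, which may differ slightly from the value used above.
    """
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Airport coordinates from the Airport information section, in decimal degrees.
d_km = haversine_km(41.906944, 128.409722, 45.623333, 126.25)
print(f"{d_km:.3f} km  ({d_km * 0.621371:.3f} mi, {d_km / 1.852:.3f} NM)")
# Prints roughly 448.1 km / 278.4 mi / 241.9 NM, matching the figures above.
```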

How long does it take to fly from Samjiyon to Harbin?

The estimated flight time from Samjiyon Airport to Harbin Taiping International Airport is 1 hour and 1 minute.
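
Estimates like this are typically derived from a rule of thumb rather than published schedules: a fixed allowance for taxi, takeoff, and landing plus the great-circle distance flown at an average cruise speed. A minimal sketch of that approach follows; the 30-minute overhead and 500 mph average speed are illustrative assumptions, not this site's documented formula.

```python
# Rule-of-thumb flight-time estimate: fixed taxi/climb/descent overhead
# plus cruise time over the great-circle distance. The 30 min overhead
# and 500 mph average speed are illustrative assumptions, not the exact
# parameters behind the 1 h 1 min figure above.
def estimate_flight_minutes(distance_miles, overhead_min=30, cruise_mph=500):
    return overhead_min + distance_miles / cruise_mph * 60

total = estimate_flight_minutes(278.369)
print(f"about {int(total // 60)} h {round(total % 60)} min")  # about 1 h 3 min
```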

What is the time difference between Samjiyon and Harbin?

The time difference between Samjiyon and Harbin is 1 hour: Samjiyon observes Pyongyang Time (UTC+9), while Harbin observes China Standard Time (UTC+8), so Samjiyon is 1 hour ahead of Harbin.

Flight carbon footprint between Samjiyon Airport (YJS) and Harbin Taiping International Airport (HRB)

On average, flying from Samjiyon to Harbin generates about 66 kg (145 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
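
Per-passenger figures like this are usually distance-based: an emission factor in kg of CO2 per passenger-kilometer, scaled by the flight distance. A minimal sketch follows, assuming a flat factor of 0.15 kg CO2 per passenger-km, an illustrative short-haul value chosen because it lands near the 66 kg figure above; it is not this site's documented methodology.

```python
# Distance-based CO2 estimate: emission factor (kg CO2 per passenger-km)
# times flight distance. The 0.15 kg/pkm factor is an illustrative
# assumption for short-haul flights, not this site's actual methodology.
KG_CO2_PER_PASSENGER_KM = 0.15

def flight_co2_kg(distance_km, factor=KG_CO2_PER_PASSENGER_KM):
    return distance_km * factor

co2 = flight_co2_kg(448.0)
print(f"about {co2:.0f} kg CO2 ({co2 * 2.20462:.0f} lb)")  # about 67 kg (148 lb)
```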

Map of flight path and driving directions from Samjiyon to Harbin

See the map of the shortest flight path between Samjiyon Airport (YJS) and Harbin Taiping International Airport (HRB).

Airport information

Origin: Samjiyon Airport
City: Samjiyon
Country: North Korea
IATA Code: YJS
ICAO Code: ZKSE
Coordinates: 41°54′25″N, 128°24′35″E
Destination: Harbin Taiping International Airport
City: Harbin
Country: China
IATA Code: HRB
ICAO Code: ZYHB
Coordinates: 45°37′24″N, 126°15′0″E
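
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion sketch (the helper name is illustrative, not from any particular library):

```python
# Convert degrees/minutes/seconds (as listed above) to the decimal
# degrees expected by the distance formulas. Helper name is illustrative.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(41, 54, 25, "N"))   # 41.906944... (Samjiyon latitude)
print(dms_to_decimal(128, 24, 35, "E"))  # 128.409722... (Samjiyon longitude)
print(dms_to_decimal(45, 37, 24, "N"))   # 45.623333... (Harbin latitude)
print(dms_to_decimal(126, 15, 0, "E"))   # 126.25 (Harbin longitude)
```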