
How far is Samjiyon from Shirahama?

The distance between Shirahama (Nanki–Shirahama Airport) and Samjiyon (Samjiyon Airport) is 684 miles / 1100 kilometers / 594 nautical miles.

The driving distance from Shirahama (SHM) to Samjiyon (YJS) is 1267 miles / 2039 kilometers, and travel time by car is about 29 hours 31 minutes.

Nanki–Shirahama Airport – Samjiyon Airport

Distance:
  • 684 miles
  • 1100 kilometers
  • 594 nautical miles
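The three figures above are the same distance expressed in different units. Using the exact metric definitions of the international mile and the nautical mile, the conversion can be checked in a few lines (the starting value is the Vincenty distance quoted in the section below):

```python
KM_PER_MILE = 1.609344  # international mile, exact by definition
KM_PER_NM = 1.852       # nautical mile, exact by definition

km = 1100.330           # Vincenty distance from the section below
miles = km / KM_PER_MILE
nm = km / KM_PER_NM
print(round(miles, 3), round(nm, 3))
```

Dividing rather than multiplying by rounded reciprocal factors avoids compounding rounding error.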


Distance from Shirahama to Samjiyon

There are several ways to calculate the distance from Shirahama to Samjiyon. Here are two standard methods:

Vincenty's formula (applied above)
  • 683.713 miles
  • 1100.330 kilometers
  • 594.131 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points using an ellipsoidal model of the Earth, which accounts for the planet's polar flattening and is therefore more accurate than spherical methods.
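A minimal Python sketch of Vincenty's inverse formula, using the standard WGS-84 ellipsoid parameters and the airport coordinates listed at the bottom of this page (the iteration and series terms follow the commonly published form of the algorithm):

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def dms(deg, minutes, seconds):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return deg + minutes / 60 + seconds / 3600

def vincenty_km(lat1, lon1, lat2, lon2):
    a = 6378137.0          # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563  # WGS-84 flattening
    b = (1 - f) * a        # semi-minor axis (m)

    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    lam = L
    for _ in range(200):  # iterate until lambda converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2  # nonzero off the equator
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
            C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # metres -> km

# SHM (33°39′43″N, 135°21′50″E) to YJS (41°54′25″N, 128°24′35″E)
d = vincenty_km(dms(33, 39, 43), dms(135, 21, 50),
                dms(41, 54, 25), dms(128, 24, 35))
print(round(d, 3))
```

The result lands on the figure quoted above to within metres; the iteration normally converges in a handful of passes.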

Haversine formula
  • 684.076 miles
  • 1100.914 kilometers
  • 594.446 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
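The haversine formula is short enough to implement directly. A sketch using a mean Earth radius of 6371 km (the exact radius this site uses is not stated, so a slightly different value may shift the result by a fraction of a kilometre):

```python
from math import radians, sin, cos, asin, sqrt

def dms(deg, minutes, seconds):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return deg + minutes / 60 + seconds / 3600

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r km."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

# SHM (33°39′43″N, 135°21′50″E) to YJS (41°54′25″N, 128°24′35″E)
d = haversine_km(dms(33, 39, 43), dms(135, 21, 50),
                 dms(41, 54, 25), dms(128, 24, 35))
print(round(d, 1))
```

This agrees with the 1100.914 km quoted above to within about a kilometre, and with the Vincenty figure to within roughly 0.05% — the typical error introduced by assuming a spherical Earth.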

How long does it take to fly from Shirahama to Samjiyon?

The estimated flight time from Nanki–Shirahama Airport to Samjiyon Airport is 1 hour and 47 minutes.
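The page does not state its flight-time model, but a common rule of thumb — an assumed average cruise speed of about 500 mph plus a fixed half-hour for taxi, climb, and descent — gets close (both parameters here are assumptions, not the site's actual values):

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Cruise time plus a fixed taxi/climb/descent overhead (both assumed)."""
    return distance_miles / cruise_mph * 60 + overhead_min

m = estimate_flight_minutes(684)
print(f"about {int(m // 60)} h {round(m % 60)} min")
```

For the 684-mile route this gives about 1 h 52 min, in the same ballpark as the 1 hour 47 minutes quoted above.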

Flight carbon footprint between Nanki–Shirahama Airport (SHM) and Samjiyon Airport (YJS)

On average, flying from Shirahama to Samjiyon generates about 123 kg of CO2 per passenger, equivalent to about 271 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
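The kilograms-to-pounds conversion can be checked directly with the standard factor of 2.20462 lb per kg:

```python
KG_TO_LB = 2.20462  # pounds per kilogram
co2_kg = 123        # estimated CO2 per passenger on this route
co2_lb = round(co2_kg * KG_TO_LB)
print(co2_lb)
```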

Map of flight path and driving directions from Shirahama to Samjiyon

See the map of the shortest flight path between Nanki–Shirahama Airport (SHM) and Samjiyon Airport (YJS).

Airport information

Origin: Nanki–Shirahama Airport
City: Shirahama
Country: Japan
IATA Code: SHM
ICAO Code: RJBD
Coordinates: 33°39′43″N, 135°21′50″E
Destination: Samjiyon Airport
City: Samjiyon
Country: North Korea
IATA Code: YJS
ICAO Code: ZKSE
Coordinates: 41°54′25″N, 128°24′35″E