
How far is Samjiyon from Iwakuni?

The distance between Iwakuni (Marine Corps Air Station Iwakuni) and Samjiyon (Samjiyon Airport) is 575 miles / 925 kilometers / 499 nautical miles.

The driving distance from Iwakuni (IWK) to Samjiyon (YJS) is 948 miles / 1526 kilometers, and travel time by car is about 23 hours 19 minutes.

Marine Corps Air Station Iwakuni – Samjiyon Airport

575 miles / 925 kilometers / 499 nautical miles


Distance from Iwakuni to Samjiyon

There are several ways to calculate the distance from Iwakuni to Samjiyon. Here are two standard methods:

Vincenty's formula (applied above)
  • 574.517 miles
  • 924.595 kilometers
  • 499.241 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
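For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates come from the airport information below (converted to decimal degrees); the iteration limit and tolerance are arbitrary choices.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles."""
        a = 6378137.0              # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):  # iterate until the longitude difference converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
                  (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (cos_sigma * (-1 + 2 * cos_2sm ** 2)
                  - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1609.344   # meters -> statute miles

    # IWK (34°8′38″N, 132°14′9″E) to YJS (41°54′25″N, 128°24′35″E), decimal degrees
    print(round(vincenty_miles(34.1439, 132.2358, 41.9069, 128.4097), 3))  # ≈ 574.5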

Haversine formula
  • 575.226 miles
  • 925.737 kilometers
  • 499.858 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
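The haversine calculation is much shorter. This sketch assumes the IUGG mean Earth radius of 6371.0088 km; sites differ slightly in which radius they use, which shifts the result by a fraction of a mile.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance on a sphere; returns statute miles."""
        R = 6371.0088  # assumed mean Earth radius (km, IUGG value)
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        km = 2 * R * math.asin(math.sqrt(a))
        return km / 1.609344

    print(round(haversine_miles(34.1439, 132.2358, 41.9069, 128.4097), 3))  # ≈ 575.2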

How long does it take to fly from Iwakuni to Samjiyon?

The estimated flight time from Marine Corps Air Station Iwakuni to Samjiyon Airport is 1 hour and 35 minutes.
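The page does not say how the estimate is derived. A common rule of thumb is a fixed overhead for taxi, climb, and descent plus cruise at an assumed average speed; the constants below (500 mph cruise, 30 minutes overhead) are assumptions and land in the same ballpark.

    def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Block time: assumed fixed overhead plus cruise at an assumed speed."""
        minutes = overhead_min + 60 * distance_miles / cruise_mph
        return f"{int(minutes // 60)} h {round(minutes % 60)} min"

    print(flight_time(574.517))  # "1 h 39 min" — near the 1 h 35 min quoted above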

Flight carbon footprint between Marine Corps Air Station Iwakuni (IWK) and Samjiyon Airport (YJS)

On average, flying from Iwakuni to Samjiyon generates about 109 kg of CO2 per passenger, which is about 241 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
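As a rough reconstruction of that figure, assume about 0.06 kg of jet fuel burned per passenger-mile (an assumed short-haul average, not a figure from this page) and the standard factor of 3.16 kg of CO2 per kg of jet fuel.

    FUEL_KG_PER_PAX_MILE = 0.06  # assumed short-haul jet average, per passenger
    CO2_KG_PER_FUEL_KG = 3.16    # standard jet-fuel combustion factor

    def co2_per_passenger_kg(distance_miles):
        """CO2 from burning jet fuel only, as in the estimate above."""
        return distance_miles * FUEL_KG_PER_PAX_MILE * CO2_KG_PER_FUEL_KG

    kg = co2_per_passenger_kg(574.517)
    print(f"{kg:.0f} kg (~{kg * 2.20462:.0f} lbs)")  # ≈ 109 kg (~240 lbs)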

Map of flight path and driving directions from Iwakuni to Samjiyon

See the map of the shortest flight path between Marine Corps Air Station Iwakuni (IWK) and Samjiyon Airport (YJS).

Airport information

Origin: Marine Corps Air Station Iwakuni
City: Iwakuni
Country: Japan
IATA Code: IWK
ICAO Code: RJOI
Coordinates: 34°8′38″N, 132°14′9″E

Destination: Samjiyon Airport
City: Samjiyon
Country: North Korea
IATA Code: YJS
ICAO Code: ZKSE
Coordinates: 41°54′25″N, 128°24′35″E