
How far is Blagoveschensk from Samjiyon?

The distance between Samjiyon (Samjiyon Airport) and Blagoveschensk (Ignatyevo Airport) is 590 miles / 950 kilometers / 513 nautical miles.

The driving distance from Samjiyon (YJS) to Blagoveschensk (BQS) is 749 miles / 1205 kilometers, and travel time by car is about 14 hours 48 minutes.

Samjiyon Airport – Ignatyevo Airport

  • 590 miles
  • 950 kilometers
  • 513 nautical miles


Distance from Samjiyon to Blagoveschensk

There are several ways to calculate the distance from Samjiyon to Blagoveschensk. Here are two standard methods:

Vincenty's formula (applied above)
  • 590.272 miles
  • 949.950 kilometers
  • 512.932 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
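The site doesn't publish its implementation, but the standard iterative Vincenty inverse solution on the WGS-84 ellipsoid can be sketched in Python roughly as follows (the airport coordinates are taken from the airport information below, converted to decimal degrees):

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0               # WGS-84 semi-major axis, metres
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L

    # Iterate on the longitude difference on the auxiliary sphere.
    for _ in range(200):
        sin_sig = math.sqrt(
            (math.cos(U2) * math.sin(lam)) ** 2
            + (math.cos(U1) * math.sin(U2)
               - math.sin(U1) * math.cos(U2) * math.cos(lam)) ** 2)
        cos_sig = (math.sin(U1) * math.sin(U2)
                   + math.cos(U1) * math.cos(U2) * math.cos(lam))
        sigma = math.atan2(sin_sig, cos_sig)
        sin_alpha = math.cos(U1) * math.cos(U2) * math.sin(lam) / sin_sig
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigm = cos_sig - 2 * math.sin(U1) * math.sin(U2) / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sig * (
                cos_2sigm + C * cos_sig * (-1 + 2 * cos_2sigm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sig * (
        cos_2sigm + B / 4 * (
            cos_sig * (-1 + 2 * cos_2sigm ** 2)
            - B / 6 * cos_2sigm * (-3 + 4 * sin_sig ** 2)
                    * (-3 + 4 * cos_2sigm ** 2)))
    return b * A * (sigma - d_sigma)

# Samjiyon (41°54′25″N, 128°24′35″E) to Ignatyevo (50°25′31″N, 127°24′43″E)
d_km = vincenty_distance(41.90694, 128.40972, 50.42528, 127.41194) / 1000
```

Run on these coordinates, the result agrees with the ~949.95 km figure quoted above.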

Haversine formula
  • 590.475 miles
  • 950.277 kilometers
  • 513.109 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
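A minimal haversine sketch, assuming the commonly used mean Earth radius of 6371 km (the site's exact radius choice is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points.
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

d_km = haversine_km(41.90694, 128.40972, 50.42528, 127.41194)
```

With these coordinates the result comes out near the ~950.28 km figure quoted above; the small difference from Vincenty reflects the spherical versus ellipsoidal Earth model.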

How long does it take to fly from Samjiyon to Blagoveschensk?

The estimated flight time from Samjiyon Airport to Ignatyevo Airport is 1 hour and 37 minutes.
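The site doesn't state how it derives this figure, but a common back-of-the-envelope estimate (both parameters below are assumptions, not the site's method) divides the distance by a typical jet cruise speed of about 500 mph and adds a fixed allowance for taxi, climb, and descent:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time heuristic: cruise time plus fixed ground/climb overhead.

    cruise_mph and overhead_min are illustrative assumptions.
    """
    return overhead_min + 60 * distance_miles / cruise_mph

minutes = estimated_flight_minutes(590)
```

For 590 miles this gives roughly 101 minutes, in the same ballpark as the quoted 1 hour 37 minutes.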

Flight carbon footprint between Samjiyon Airport (YJS) and Ignatyevo Airport (BQS)

On average, flying from Samjiyon to Blagoveschensk generates about 111 kg (246 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
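The kilogram-to-pound conversion itself is exact by definition; a quick sketch:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds using the exact legal definition."""
    return kg / KG_PER_LB

# The rounded 111 kg converts to about 244.7 lb; the quoted 246 lb was
# likely computed from the unrounded underlying estimate.
lb = kg_to_lb(111)
```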

Map of flight path and driving directions from Samjiyon to Blagoveschensk

See the map of the shortest flight path between Samjiyon Airport (YJS) and Ignatyevo Airport (BQS).

Airport information

Origin Samjiyon Airport
City: Samjiyon
Country: North Korea
IATA Code: YJS
ICAO Code: ZKSE
Coordinates: 41°54′25″N, 128°24′35″E
Destination Ignatyevo Airport
City: Blagoveschensk
Country: Russia
IATA Code: BQS
ICAO Code: UHBB
Coordinates: 50°25′31″N, 127°24′43″E
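The coordinates above are given in degrees/minutes/seconds; the distance formulas expect decimal degrees. A small conversion helper (the function name is just illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Samjiyon Airport: 41°54′25″N, 128°24′35″E
lat = dms_to_decimal(41, 54, 25, "N")   # ≈ 41.9069
lon = dms_to_decimal(128, 24, 35, "E")  # ≈ 128.4097
```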