How far is Samjiyon from Beijing?
The distance between Beijing (Beijing Capital International Airport) and Samjiyon (Samjiyon Airport) is 630 miles / 1015 kilometers / 548 nautical miles.
The driving distance from Beijing (PEK) to Samjiyon (YJS) is 786 miles / 1265 kilometers, and travel time by car is about 15 hours 30 minutes.
Beijing Capital International Airport – Samjiyon Airport
Distance from Beijing to Samjiyon
There are several ways to calculate the distance from Beijing to Samjiyon. Here are two standard methods:
Vincenty's formula (applied above)
- 630.429 miles
- 1014.576 kilometers
- 547.827 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
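For comparison, an ellipsoidal-model distance can be reproduced with the geopy library. This is only a minimal sketch: geopy's geodesic distance uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's iteration, but the two methods agree very closely, so the result should land within rounding of the figures above.

```python
# Ellipsoidal (WGS-84) distance between PEK and YJS using geopy.
# The decimal-degree coordinates are converted from the DMS values
# in the airport table further down.
from geopy.distance import geodesic

pek = (40.0800, 116.5847)   # Beijing Capital International Airport
yjs = (41.9069, 128.4097)   # Samjiyon Airport

d = geodesic(pek, yjs)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
# Expected to be within rounding of 630.429 mi / 1014.576 km / 547.827 NM
```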
Haversine formula
- 628.913 miles
- 1012.138 kilometers
- 546.511 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points over the earth's surface).
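The haversine calculation is short enough to write out directly. Here is a minimal Python sketch, assuming the conventional mean Earth radius of 6,371 km and the decimal-degree equivalents of the airport coordinates listed below; it reproduces the great-circle figures above to within rounding.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical earth (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# PEK (40°4′48″N, 116°35′5″E) to YJS (41°54′25″N, 128°24′35″E)
km = haversine_km(40.0800, 116.5847, 41.9069, 128.4097)
print(f"{km * 0.621371:.3f} mi / {km:.3f} km / {km * 0.539957:.3f} NM")
# Expected to be close to 628.913 mi / 1012.138 km / 546.511 NM
```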
How long does it take to fly from Beijing to Samjiyon?
The estimated flight time from Beijing Capital International Airport to Samjiyon Airport is 1 hour and 41 minutes.
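Estimates like this are typically a fixed allowance for taxi, climb, and descent plus cruise time at an assumed average ground speed. The allowance and speed below are illustrative assumptions rather than this site's documented parameters, so the result differs slightly from the 1 hour 41 minute figure above.

```python
# Hypothetical flight-time estimate: a fixed 30-minute allowance for taxi,
# takeoff, and landing plus cruise at an assumed 500 mph average speed.
# Both constants are assumptions chosen for illustration only.
def estimate_flight_minutes(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    return overhead_min + distance_miles / cruise_mph * 60.0

minutes = estimate_flight_minutes(630.429)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # about 1 h 46 min with these assumptions
```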
What is the time difference between Beijing and Samjiyon?
Flight carbon footprint between Beijing Capital International Airport (PEK) and Samjiyon Airport (YJS)
On average, flying from Beijing to Samjiyon generates about 116 kg of CO2 per passenger, which is roughly 256 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
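Per-passenger CO2 figures like this are generally derived by multiplying an assumed fuel burn per passenger-kilometre by the roughly 3.16 kg of CO2 released per kilogram of jet fuel burned. The burn rate in the sketch below is an illustrative assumption, not the value behind the estimate above.

```python
# Hypothetical back-of-the-envelope CO2 estimate for one passenger.
# fuel_per_pax_km is an assumed illustrative value; 3.16 kg CO2 per kg of
# jet fuel is the standard combustion factor for kerosene.
distance_km = 1014.576            # ellipsoidal distance from above
fuel_per_pax_km = 0.036           # kg of jet fuel per passenger-km (assumption)
co2_kg = distance_km * fuel_per_pax_km * 3.16
print(f"{co2_kg:.0f} kg CO2 ≈ {co2_kg * 2.20462:.0f} lbs")   # ~115 kg ≈ 254 lbs
```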
Map of flight path and driving directions from Beijing to Samjiyon
See the map of the shortest flight path between Beijing Capital International Airport (PEK) and Samjiyon Airport (YJS).
Airport information
| Origin | Beijing Capital International Airport |
| --- | --- |
| City: | Beijing |
| Country: | China |
| IATA Code: | PEK |
| ICAO Code: | ZBAA |
| Coordinates: | 40°4′48″N, 116°35′5″E |
| Destination | Samjiyon Airport |
| --- | --- |
| City: | Samjiyon |
| Country: | North Korea |
| IATA Code: | YJS |
| ICAO Code: | ZKSE |
| Coordinates: | 41°54′25″N, 128°24′35″E |
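The coordinates above are given in degrees, minutes, and seconds; the decimal-degree values used in the distance sketches earlier follow from the usual conversion, shown below.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# PEK: 40°4′48″N, 116°35′5″E   ->  40.0800, 116.5847
# YJS: 41°54′25″N, 128°24′35″E ->  41.9069, 128.4097
print(dms_to_decimal(40, 4, 48), dms_to_decimal(116, 35, 5, "E"))
print(dms_to_decimal(41, 54, 25), dms_to_decimal(128, 24, 35, "E"))
```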