
How far is Baishan from Blagoveschensk?

The distance between Blagoveschensk (Ignatyevo Airport) and Baishan (Changbaishan Airport) is 577 miles / 929 kilometers / 502 nautical miles.

The driving distance from Blagoveschensk (BQS) to Baishan (NBS) is 733 miles / 1180 kilometers, and travel time by car is about 14 hours 7 minutes.


Distance from Blagoveschensk to Baishan

There are several ways to calculate the distance from Blagoveschensk to Baishan. Here are two standard methods:

Vincenty's formula (applied above)
  • 577.383 miles
  • 929.208 kilometers
  • 501.732 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
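For reference, a direct Python implementation of the inverse Vincenty formula on the WGS-84 ellipsoid looks like the sketch below. This is a standard textbook implementation, not necessarily the exact code this calculator runs; the airport coordinates are the DMS values listed further down, converted to decimal degrees.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometres."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until lambda converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0            # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial-line special case
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                             * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1000.0

# BQS (50°25′31″N, 127°24′43″E) to NBS (42°4′0″N, 127°36′7″E)
dist_km = vincenty_km(50.425278, 127.411944, 42.066667, 127.601944)
```

With these coordinates the result agrees with the 929.208 km quoted above to within the rounding of the DMS values.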

Haversine formula
  • 577.585 miles
  • 929.533 kilometers
  • 501.908 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
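The haversine computation is much shorter and can be sketched in a few lines of Python. The mean Earth radius of 6,371 km is an assumption here, since the calculator does not state which radius it uses:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, assuming a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# BQS (50°25′31″N, 127°24′43″E) to NBS (42°4′0″N, 127°36′7″E)
dist_km = haversine_km(50.425278, 127.411944, 42.066667, 127.601944)
```

With the 6,371 km radius this reproduces the 929.533 km figure above to within a tenth of a kilometre.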

How long does it take to fly from Blagoveschensk to Baishan?

The estimated flight time from Ignatyevo Airport to Changbaishan Airport is 1 hour and 35 minutes.
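This estimate is consistent with a common rule of thumb for short flights: cruise the great-circle distance at roughly 500 mph and add about 30 minutes for taxi, climb, and descent. Those two numbers are assumptions, not the calculator's published formula, and with them the sketch below gives about 1 hour 39 minutes, close to the 1 hour 35 minutes quoted:

```python
def estimated_flight_minutes(distance_miles, avg_speed_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed block overhead plus cruise time."""
    return overhead_min + distance_miles / avg_speed_mph * 60

minutes = estimated_flight_minutes(577.383)
```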

Flight carbon footprint between Ignatyevo Airport (BQS) and Changbaishan Airport (NBS)

On average, flying from Blagoveschensk to Baishan generates about 110 kg of CO2 per passenger, equivalent to 242 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
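The unit conversion in the paragraph above uses the standard factor of about 2.20462 pounds per kilogram:

```python
LB_PER_KG = 2.20462  # standard kilogram-to-pound conversion factor

co2_kg = 110
co2_lb = co2_kg * LB_PER_KG  # about 242.5 lb, quoted as 242 in the text
```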


Airport information

Origin Ignatyevo Airport
City: Blagoveschensk
Country: Russia
IATA Code: BQS
ICAO Code: UHBB
Coordinates: 50°25′31″N, 127°24′43″E
Destination Changbaishan Airport
City: Baishan
Country: China
IATA Code: NBS
ICAO Code: ZYBS
Coordinates: 42°4′0″N, 127°36′7″E