
How far is Qinhuangdao from Suntar?

The distance between Suntar (Suntar Airport) and Qinhuangdao (Qinhuangdao Beidaihe Airport) is 1558 miles / 2507 kilometers / 1354 nautical miles.

The driving distance from Suntar (SUY) to Qinhuangdao (BPE) is 2190 miles / 3525 kilometers, and travel time by car is about 64 hours 53 minutes.

Suntar Airport – Qinhuangdao Beidaihe Airport

1558 Miles
2507 Kilometers
1354 Nautical miles


Distance from Suntar to Qinhuangdao

There are several ways to calculate the distance from Suntar to Qinhuangdao. Here are two standard methods:

Vincenty's formula (applied above)
  • 1557.725 miles
  • 2506.916 kilometers
  • 1353.626 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
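For readers who want to reproduce the figure, here is a minimal Python sketch using the geographiclib library to solve the inverse geodesic problem on the WGS-84 ellipsoid. Note that geographiclib implements Karney's algorithm rather than Vincenty's iteration, so the result may differ from the value above by a tiny amount. The decimal coordinates are converted from the airport listings at the bottom of this page.

```python
from geographiclib.geodesic import Geodesic

# Decimal degrees converted from the airport coordinates listed below
suntar = (62.1850, 117.6350)       # Suntar Airport (SUY): 62°11′6″N, 117°38′6″E
qinhuangdao = (39.6664, 119.0589)  # Qinhuangdao Beidaihe Airport (BPE): 39°39′59″N, 119°3′32″E

# Inverse geodesic problem on the WGS-84 ellipsoid (Karney's algorithm,
# which solves the same ellipsoidal problem Vincenty's formula addresses)
inverse = Geodesic.WGS84.Inverse(*suntar, *qinhuangdao)
meters = inverse["s12"]

print(f"{meters / 1609.344:.3f} miles")
print(f"{meters / 1000:.3f} kilometers")
print(f"{meters / 1852:.3f} nautical miles")
```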

Haversine formula
  • 1557.034 miles
  • 2505.803 kilometers
  • 1353.026 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
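The haversine calculation is simple enough to write out directly. A minimal Python sketch, assuming a mean earth radius of 3,958.8 miles:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two points on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_miles * math.asin(math.sqrt(a))

# Suntar (SUY) to Qinhuangdao Beidaihe (BPE), coordinates as listed below
print(round(haversine_miles(62.1850, 117.6350, 39.6664, 119.0589), 3))  # ≈ 1557 miles
```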

How long does it take to fly from Suntar to Qinhuangdao?

The estimated flight time from Suntar Airport to Qinhuangdao Beidaihe Airport is 3 hours and 26 minutes.
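Flight-time estimates like this are typically built by dividing the great-circle distance by an assumed cruise speed and adding a fixed allowance for taxi, takeoff, and landing. The sketch below uses a 500 mph cruise speed and a 30-minute allowance purely as illustrative assumptions; they are not necessarily the values this calculator uses, so the result will not match 3 hours 26 minutes exactly.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_minutes=30):
    """Rough airport-to-airport flight time: cruise time plus a fixed allowance
    for taxi, takeoff, climb, and landing (assumed values, not this site's formula)."""
    total_minutes = distance_miles / cruise_mph * 60 + overhead_minutes
    hours, minutes = divmod(round(total_minutes), 60)
    return f"{hours} hours {minutes} minutes"

print(estimated_flight_time(1558))  # about 3 hours 37 minutes with these assumptions
```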

Flight carbon footprint between Suntar Airport (SUY) and Qinhuangdao Beidaihe Airport (BPE)

On average, flying from Suntar to Qinhuangdao generates about 183 kg (404 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
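Per-passenger CO2 estimates of this kind are usually derived from an assumed fuel burn per passenger-kilometre multiplied by roughly 3.16 kg of CO2 released per kilogram of jet fuel burned. The sketch below uses an illustrative fuel-burn rate of 0.023 kg per passenger-km, chosen only to land near the quoted figure; the site's actual model is not published here.

```python
JET_FUEL_CO2_PER_KG = 3.16   # kg of CO2 per kg of jet fuel burned (standard emission factor)
KG_PER_POUND = 0.45359237

def co2_per_passenger_kg(distance_km, fuel_kg_per_pax_km=0.023):
    """Illustrative per-passenger CO2 estimate; the fuel-burn rate is an assumption."""
    return distance_km * fuel_kg_per_pax_km * JET_FUEL_CO2_PER_KG

kg = co2_per_passenger_kg(2507)
print(f"{kg:.0f} kg ≈ {kg / KG_PER_POUND:.0f} lbs")  # roughly 182 kg ≈ 402 lbs with these assumptions
```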

Map of flight path and driving directions from Suntar to Qinhuangdao

See the map of the shortest flight path between Suntar Airport (SUY) and Qinhuangdao Beidaihe Airport (BPE).

Airport information

Origin Suntar Airport
City: Suntar
Country: Russia
IATA Code: SUY
ICAO Code: UENS
Coordinates: 62°11′6″N, 117°38′6″E
Destination Qinhuangdao Beidaihe Airport
City: Qinhuangdao
Country: China
IATA Code: BPE
ICAO Code: ZBDH
Coordinates: 39°39′59″N, 119°3′32″E