
How far is Qinhuangdao from Magnitogorsk?

The distance between Magnitogorsk (Magnitogorsk International Airport) and Qinhuangdao (Qinhuangdao Beidaihe Airport) is 2929 miles / 4714 kilometers / 2545 nautical miles.

Magnitogorsk International Airport – Qinhuangdao Beidaihe Airport

2929 miles
4714 kilometers
2545 nautical miles


Distance from Magnitogorsk to Qinhuangdao

There are several ways to calculate the distance from Magnitogorsk to Qinhuangdao. Here are two standard methods:

Vincenty's formula (applied above)
  • 2928.965 miles
  • 4713.713 kilometers
  • 2545.201 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
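The ellipsoidal figure above can be reproduced with a sketch of Vincenty's inverse formula. This is a minimal implementation assuming the WGS-84 ellipsoid (semi-major axis 6378137 m, flattening 1/298.257223563); the airport coordinates are the DMS values from the table below converted to decimal degrees.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a, f = 6378137.0, 1 / 298.257223563  # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    L = radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = atan((1 - f) * tan(radians(lat1)))
    U2 = atan((1 - f) * tan(radians(lat2)))
    sU1, cU1, sU2, cU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):  # iterate until the longitude correction converges
        sl, cl = sin(lam), cos(lam)
        sin_sigma = sqrt((cU2 * sl) ** 2 + (cU1 * sU2 - sU1 * cU2 * cl) ** 2)
        cos_sigma = sU1 * sU2 + cU1 * cU2 * cl
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cU1 * cU2 * sl / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sU1 * sU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev, lam = lam, L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# MQF (53°23′35″N, 58°45′20″E) and BPE (39°39′59″N, 119°3′32″E) in decimal degrees
print(round(vincenty_km(53.3931, 58.7556, 39.6664, 119.0589), 1))
```

The iteration converges in a handful of steps for most airport pairs; Vincenty's method can fail to converge only for nearly antipodal points, which does not apply here.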

Haversine formula
  • 2921.568 miles
  • 4701.808 kilometers
  • 2538.773 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
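The great-circle figure can be checked with a short haversine sketch, assuming a mean Earth radius of 6371 km and the airport coordinates from the table below converted to decimal degrees:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometers on a sphere of radius r."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

mqf = (53.3931, 58.7556)   # Magnitogorsk (MQF)
bpe = (39.6664, 119.0589)  # Qinhuangdao Beidaihe (BPE)
print(round(haversine_km(*mqf, *bpe), 1))  # ≈ 4702 km, matching the figure above
```

The spherical result differs from the ellipsoidal Vincenty value by roughly 12 km (about 0.25%), which is typical of the error introduced by treating the Earth as a sphere.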

How long does it take to fly from Magnitogorsk to Qinhuangdao?

The estimated flight time from Magnitogorsk International Airport to Qinhuangdao Beidaihe Airport is 6 hours and 2 minutes.
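Estimates like this typically add a fixed allowance for taxi, climb, and descent to the cruise time. A rough sketch — the 500 mph cruise speed and 30-minute allowance here are illustrative assumptions, not the site's actual parameters:

```python
def est_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance."""
    return distance_miles / cruise_mph * 60 + overhead_min

m = est_flight_minutes(2929)
print(f"{int(m // 60)} h {int(m % 60)} min")  # 6 h 21 min with these assumptions
```

These assumptions yield roughly 6 h 21 min; the 6 h 2 min figure above implies a somewhat faster effective cruise speed.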

Flight carbon footprint between Magnitogorsk International Airport (MQF) and Qinhuangdao Beidaihe Airport (BPE)

On average, flying from Magnitogorsk to Qinhuangdao generates about 326 kg (718 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
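The unit conversion uses the exact definition of the avoirdupois pound (1 lb = 0.45359237 kg):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds using the exact pound definition."""
    return kg / KG_PER_LB

print(round(kg_to_lb(326), 1))  # 718.7 lb (the page truncates to 718)
```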

Map of flight path from Magnitogorsk to Qinhuangdao

See the map of the shortest flight path between Magnitogorsk International Airport (MQF) and Qinhuangdao Beidaihe Airport (BPE).

Airport information

Origin Magnitogorsk International Airport
City: Magnitogorsk
Country: Russia
IATA Code: MQF
ICAO Code: USCM
Coordinates: 53°23′35″N, 58°45′20″E
Destination Qinhuangdao Beidaihe Airport
City: Qinhuangdao
Country: China
IATA Code: BPE
ICAO Code: ZBDH
Coordinates: 39°39′59″N, 119°3′32″E
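The DMS coordinates above convert to the decimal degrees used by both distance formulas as degrees + minutes/60 + seconds/3600, negated for southern or western hemispheres:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(round(dms_to_decimal(53, 23, 35), 4))   # MQF latitude  → 53.3931
print(round(dms_to_decimal(119, 3, 32), 4))   # BPE longitude → 119.0589
```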