
How far is Qinhuangdao from Akita?

The distance between Akita (Akita Airport) and Qinhuangdao (Qinhuangdao Beidaihe Airport) is 1126 miles / 1812 kilometers / 978 nautical miles.

The driving distance from Akita (AXT) to Qinhuangdao (BPE) is 1933 miles / 3111 kilometers, and travel time by car is about 40 hours 11 minutes.

Akita Airport – Qinhuangdao Beidaihe Airport

Distance: 1126 miles / 1812 kilometers / 978 nautical miles


Distance from Akita to Qinhuangdao

There are several ways to calculate the distance from Akita to Qinhuangdao. Here are two standard methods:

Vincenty's formula (applied above)
  • 1126.015 miles
  • 1812.146 kilometers
  • 978.481 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
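
The site does not publish its implementation. As a rough sketch in Python, geopy's geodesic distance (Karney's algorithm on the WGS-84 ellipsoid, not Vincenty's formula itself) reproduces the ellipsoidal figures above; the decimal coordinates are converted from the DMS values in the airport information section below.

    # Sketch only: geopy's geodesic uses Karney's algorithm on WGS-84,
    # which agrees with Vincenty's ellipsoidal result to sub-millimetre
    # precision on a route like this one.
    from geopy.distance import geodesic

    axt = (39.6156, 140.2189)  # Akita Airport, decimal degrees
    bpe = (39.6664, 119.0589)  # Qinhuangdao Beidaihe Airport

    d = geodesic(axt, bpe)
    print(f"{d.miles:.1f} mi / {d.kilometers:.1f} km / {d.nautical:.1f} nm")
    # expect about 1126.0 mi / 1812.1 km / 978.5 nm, matching the figures above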

Haversine formula
  • 1123.224 miles
  • 1807.653 kilometers
  • 976.055 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
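
In Python, a minimal, self-contained haversine implementation looks like this; the small difference from the quoted 1807.653 km comes from the Earth radius chosen and from rounding the coordinates.

    import math

    def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two points on a sphere, in km."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlon = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(a))

    axt = (39.6156, 140.2189)  # Akita Airport
    bpe = (39.6664, 119.0589)  # Qinhuangdao Beidaihe Airport
    print(f"{haversine(*axt, *bpe):.1f} km")  # about 1808 km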

How long does it take to fly from Akita to Qinhuangdao?

The estimated flight time from Akita Airport to Qinhuangdao Beidaihe Airport is 2 hours and 37 minutes.
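
The site does not state how it estimates flight time. A common rule of thumb, shown below with assumed numbers (an average cruise speed of about 500 mph plus roughly 30 minutes for taxi, climb, and descent), lands within a few minutes of the quoted figure.

    # Back-of-the-envelope estimate; the cruise speed and ground allowance
    # are assumptions, not the site's published formula.
    CRUISE_MPH = 500
    GROUND_MINUTES = 30

    distance_miles = 1126
    total = distance_miles / CRUISE_MPH * 60 + GROUND_MINUTES
    print(f"{int(total // 60)} h {int(total % 60)} min")  # 2 h 45 min vs the quoted 2 h 37 min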

Flight carbon footprint between Akita Airport (AXT) and Qinhuangdao Beidaihe Airport (BPE)

On average, flying from Akita to Qinhuangdao generates about 158 kg of CO2 per passenger, which is about 348 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
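
The unit conversion and the implied per-mile intensity are easy to check (plain arithmetic; the site's emissions model itself is not published):

    co2_kg = 158
    distance_miles = 1126

    print(f"{co2_kg * 2.20462:.0f} lb")  # 348 lb
    print(f"{co2_kg / distance_miles * 1000:.0f} g CO2 per passenger-mile")  # about 140 g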

Map of flight path and driving directions from Akita to Qinhuangdao

See the map of the shortest flight path between Akita Airport (AXT) and Qinhuangdao Beidaihe Airport (BPE).

Airport information

Origin: Akita Airport
City: Akita
Country: Japan
IATA Code: AXT
ICAO Code: RJSK
Coordinates: 39°36′56″N, 140°13′8″E

Destination: Qinhuangdao Beidaihe Airport
City: Qinhuangdao
Country: China
IATA Code: BPE
ICAO Code: ZBDH
Coordinates: 39°39′59″N, 119°3′32″E
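
The coordinates above are given in degrees/minutes/seconds, while the distance sketches earlier use decimal degrees. A minimal conversion:

    def dms_to_decimal(degrees, minutes, seconds):
        """Convert degrees/minutes/seconds to decimal degrees (north/east positive)."""
        return degrees + minutes / 60 + seconds / 3600

    print(dms_to_decimal(39, 36, 56))   # AXT latitude:  about 39.6156
    print(dms_to_decimal(140, 13, 8))   # AXT longitude: about 140.2189
    print(dms_to_decimal(39, 39, 59))   # BPE latitude:  about 39.6664
    print(dms_to_decimal(119, 3, 32))   # BPE longitude: about 119.0589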