
How far is Qinhuangdao from Kikai?

The distance between Kikai (Kikai Airport) and Qinhuangdao (Qinhuangdao Beidaihe Airport) is 999 miles / 1607 kilometers / 868 nautical miles.

The driving distance from Kikai (KKX) to Qinhuangdao (BPE) is 1518 miles / 2443 kilometers, and travel time by car is about 113 hours 27 minutes.

Kikai Airport – Qinhuangdao Beidaihe Airport

999 miles / 1607 kilometers / 868 nautical miles


Distance from Kikai to Qinhuangdao

There are several ways to calculate the distance from Kikai to Qinhuangdao. Here are two standard methods:

Vincenty's formula (applied above)
  • 998.717 miles
  • 1607.280 kilometers
  • 867.862 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
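As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport information below; the convergence tolerance and iteration cap are arbitrary choices, not parameters taken from this calculator.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # both points on the equator
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Kikai (KKX) to Qinhuangdao Beidaihe (BPE), in decimal degrees
meters = vincenty_distance(28.3211, 129.9278, 39.6664, 119.0589)
print(f"{meters / 1609.344:.3f} mi, {meters / 1000:.3f} km, {meters / 1852:.3f} NM")
```

The result should agree closely with the Vincenty figures quoted above, with small differences attributable to rounding of the input coordinates.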

Haversine formula
  • 999.383 miles
  • 1608.351 kilometers
  • 868.440 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
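For comparison, a short Python sketch of the haversine great-circle distance, using a mean Earth radius of 6371 km; the exact radius this calculator assumes is not stated, and a slightly different radius shifts the result by a kilometre or so.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_distance(28.3211, 129.9278, 39.6664, 119.0589)
print(f"{km / 1.609344:.3f} mi, {km:.3f} km, {km / 1.852:.3f} NM")
```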

How long does it take to fly from Kikai to Qinhuangdao?

The estimated flight time from Kikai Airport to Qinhuangdao Beidaihe Airport is 2 hours and 23 minutes.
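The page does not say how this figure is derived. A common back-of-envelope approach is to divide the distance by a typical cruise speed and add a fixed allowance for taxi, climb, and descent; the sketch below uses assumed values (500 mph cruise, 30 minutes overhead), so it will not exactly reproduce the 2 hours 23 minutes quoted above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough estimate: cruise time plus a fixed taxi/climb/descent buffer (assumed values)."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(999))  # illustrative only; the site's own model may differ
```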

Flight carbon footprint between Kikai Airport (KKX) and Qinhuangdao Beidaihe Airport (BPE)

On average, flying from Kikai to Qinhuangdao generates about 151 kg of CO2 per passenger, which is equivalent to roughly 332 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
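As a quick check on the arithmetic, the sketch below converts the per-passenger figure to pounds and backs out the per-kilometre emission factor it implies; the resulting factor (about 94 g CO2 per passenger-kilometre) is simply what makes the quoted numbers consistent, not necessarily the factor this calculator uses.

```python
KG_PER_LB = 0.45359237

co2_kg = 151        # per-passenger estimate quoted above
distance_km = 1607  # distance quoted above

co2_lbs = co2_kg / KG_PER_LB                     # 151 kg is about 333 lbs (the page rounds to 332)
factor_g_per_km = co2_kg * 1000 / distance_km    # implied per-passenger emission factor, ~94 g/km

print(f"{co2_kg} kg = {co2_lbs:.1f} lbs")
print(f"implied factor: {factor_g_per_km:.0f} g CO2 per passenger-km")
```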

Map of flight path and driving directions from Kikai to Qinhuangdao

See the map of the shortest flight path between Kikai Airport (KKX) and Qinhuangdao Beidaihe Airport (BPE).

Airport information

Origin Kikai Airport
City: Kikai
Country: Japan
IATA Code: KKX
ICAO Code: RJKI
Coordinates: 28°19′16″N, 129°55′40″E
Destination Qinhuangdao Beidaihe Airport
City: Qinhuangdao
Country: China
IATA Code: BPE
ICAO Code: ZBDH
Coordinates: 39°39′59″N, 119°3′32″E