
How far is Beijing from Kikai?

The distance between Kikai (Kikai Airport) and Beijing (Beijing Daxing International Airport) is 1092 miles / 1757 kilometers / 949 nautical miles.

The driving distance from Kikai (KKX) to Beijing (PKX) is 1540 miles / 2478 kilometers, and travel time by car is about 114 hours 14 minutes.

Kikai Airport – Beijing Daxing International Airport

  • 1092 miles
  • 1757 kilometers
  • 949 nautical miles


Distance from Kikai to Beijing

There are several ways to calculate the distance from Kikai to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 1092.032 miles
  • 1757.455 kilometers
  • 948.950 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
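As a sketch, Vincenty's inverse method on the WGS-84 ellipsoid can be implemented as follows. This is the standard textbook formulation, not necessarily the exact code used here; the decimal coordinates are converted from the DMS values listed under "Airport information" below.

```python
import math

def vincenty(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in metres between two points on the WGS-84 ellipsoid."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial line
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# KKX (28°19′16″N, 129°55′40″E) to PKX (39°30′33″N, 116°24′38″E)
d_m = vincenty(28.321111, 129.927778, 39.509167, 116.410556)
```

Dividing the result by 1000 gives roughly the 1757.455 km quoted above; dividing by 1609.344 and 1852 gives the figures in miles and nautical miles.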

Haversine formula
  • 1092.193 miles
  • 1757.715 kilometers
  • 949.090 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
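The haversine calculation fits in a few lines. A minimal sketch, using the conventional mean earth radius of 6371 km and the airport coordinates converted to decimal degrees:

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# KKX (28°19′16″N, 129°55′40″E) to PKX (39°30′33″N, 116°24′38″E)
d_km = haversine(28.321111, 129.927778, 39.509167, 116.410556)  # ≈ 1757.7
```

The small gap between this result and the Vincenty figure (about 0.3 km here) comes from the spherical-earth assumption.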

How long does it take to fly from Kikai to Beijing?

The estimated flight time from Kikai Airport to Beijing Daxing International Airport is 2 hours and 34 minutes.
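A common rule of thumb (an assumption here, not necessarily this site's method) estimates flight time as a fixed overhead for taxi, climb, and descent plus cruise at a typical jet speed:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # 30 min covers taxi, climb and descent; 500 mph is a typical jet cruise speed
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimate_flight_minutes(1092)  # ≈ 161 min, i.e. about 2 h 41 min
```

For the 1092-mile Kikai to Beijing leg this lands in the same ballpark as the 2 hours 34 minutes quoted above.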

Flight carbon footprint between Kikai Airport (KKX) and Beijing Daxing International Airport (PKX)

On average, flying from Kikai to Beijing generates about 156 kg of CO2 per passenger, which is roughly 344 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Kikai to Beijing

See the map of the shortest flight path between Kikai Airport (KKX) and Beijing Daxing International Airport (PKX).

Airport information

Origin Kikai Airport
City: Kikai
Country: Japan
IATA Code: KKX
ICAO Code: RJKI
Coordinates: 28°19′16″N, 129°55′40″E
Destination Beijing Daxing International Airport
City: Beijing
Country: China
IATA Code: PKX
ICAO Code: ZBAD
Coordinates: 39°30′33″N, 116°24′38″E
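The coordinates above are given in degrees, minutes, and seconds, while distance formulas such as Vincenty's and the haversine expect decimal degrees. A minimal converter (the helper name is hypothetical):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Kikai Airport: 28°19′16″N, 129°55′40″E
lat = dms_to_decimal(28, 19, 16, "N")   # ≈ 28.321111
lon = dms_to_decimal(129, 55, 40, "E")  # ≈ 129.927778
```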