How far is Yanji from Kikai?

The distance between Kikai (Kikai Airport) and Yanji (Yanji Chaoyangchuan International Airport) is 1004 miles / 1616 kilometers / 873 nautical miles.

The driving distance from Kikai (KKX) to Yanji (YNJ) is 1360 miles / 2188 kilometers, and travel time by car is about 103 hours 22 minutes.

Kikai Airport – Yanji Chaoyangchuan International Airport

  • 1004 miles
  • 1616 kilometers
  • 873 nautical miles

Distance from Kikai to Yanji

There are several ways to calculate the distance from Kikai to Yanji. Here are two standard methods:

Vincenty's formula (applied above)
  • 1004.273 miles
  • 1616.221 kilometers
  • 872.690 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
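
As a rough sketch of the ellipsoidal calculation, the geopy library reproduces a figure very close to the one above. Its geodesic distance uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's iteration itself, and the decimal coordinates below are converted from the DMS values listed under "Airport information":

    from geopy.distance import geodesic

    # Decimal-degree coordinates converted from the DMS values in "Airport information"
    KKX = (28.3211, 129.9278)   # Kikai Airport
    YNJ = (42.8828, 129.4508)   # Yanji Chaoyangchuan International Airport

    d = geodesic(KKX, YNJ)      # WGS-84 ellipsoid (Karney's method)
    print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
    # Should print values very close to 1004.273 mi / 1616.221 km / 872.690 NM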

Haversine formula
  • 1006.456 miles
  • 1619.734 kilometers
  • 874.586 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
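
A minimal sketch of the haversine calculation in Python, assuming a mean Earth radius of 6,371 km and using the same airport coordinates converted to decimal degrees:

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # Great-circle distance on a sphere with the given mean Earth radius
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(a))

    km = haversine_km(28.3211, 129.9278, 42.8828, 129.4508)   # KKX -> YNJ
    print(f"{km / 1.609344:.1f} mi / {km:.1f} km / {km / 1.852:.1f} NM")
    # Roughly 1006.5 mi / 1619.7 km / 874.6 NM

The roughly 3.5 km difference from the ellipsoidal result above comes entirely from treating the Earth as a perfect sphere.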

How long does it take to fly from Kikai to Yanji?

The estimated flight time from Kikai Airport to Yanji Chaoyangchuan International Airport is 2 hours and 24 minutes.
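
The assumptions behind this figure are not stated; a common rule of thumb that lands close to it is to divide the great-circle distance by an average block speed of about 500 mph and add roughly half an hour for taxi, climb, and descent:

    def rough_flight_time_hours(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
        # Rule-of-thumb estimate only; real block times vary with aircraft, winds and routing
        return overhead_hours + distance_miles / cruise_mph

    hours = rough_flight_time_hours(1004.273)
    print(f"{int(hours)} h {round((hours % 1) * 60)} min")
    # About 2 h 31 min, close to the 2 h 24 min quoted above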

Flight carbon footprint between Kikai Airport (KKX) and Yanji Chaoyangchuan International Airport (YNJ)

On average, flying from Kikai to Yanji generates about 151 kg of CO2 per passenger, which is equivalent to 333 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
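
As a sketch of how such a figure can be reproduced, the total works out to roughly 0.15 kg of CO2 per passenger-mile on this route; the per-mile emission factor below is an assumption for illustration, not the calculator's published method:

    KG_PER_LB = 0.45359237

    def co2_per_passenger_kg(distance_miles, kg_per_mile=0.15):
        # kg_per_mile is an assumed average emission factor, not the site's formula
        return distance_miles * kg_per_mile

    kg = co2_per_passenger_kg(1004.273)
    print(f"{kg:.0f} kg CO2 ≈ {kg / KG_PER_LB:.0f} lbs")
    # Roughly 151 kg, i.e. about 332 lbs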

Map of flight path and driving directions from Kikai to Yanji

See the map of the shortest flight path between Kikai Airport (KKX) and Yanji Chaoyangchuan International Airport (YNJ).

Airport information

Origin: Kikai Airport
City: Kikai
Country: Japan
IATA Code: KKX
ICAO Code: RJKI
Coordinates: 28°19′16″N, 129°55′40″E
Destination: Yanji Chaoyangchuan International Airport
City: Yanji
Country: China
IATA Code: YNJ
ICAO Code: ZYYJ
Coordinates: 42°52′58″N, 129°27′3″E
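
For reference, a small helper to convert the DMS coordinates listed above into the decimal degrees used by the distance calculations earlier on the page:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # N/E are positive, S/W negative
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    kkx = (dms_to_decimal(28, 19, 16, "N"), dms_to_decimal(129, 55, 40, "E"))
    ynj = (dms_to_decimal(42, 52, 58, "N"), dms_to_decimal(129, 27, 3, "E"))
    print(kkx, ynj)   # roughly (28.3211, 129.9278) and (42.8828, 129.4508)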