
How far is Ulanhot from Kitakyushu?

The distance between Kitakyushu (Kitakyushu Airport) and Ulanhot (Ulanhot Yilelite Airport) is 969 miles / 1560 kilometers / 842 nautical miles.

The driving distance from Kitakyushu (KKJ) to Ulanhot (HLH) is 1212 miles / 1950 kilometers, and travel time by car is about 26 hours 10 minutes.

Distance from Kitakyushu to Ulanhot

There are several ways to calculate the distance from Kitakyushu to Ulanhot. Here are two standard methods:

Vincenty's formula (applied above)
  • 969.075 miles
  • 1559.576 kilometers
  • 842.104 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
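
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. It is a standard textbook implementation, not the calculator's own code; with the airport coordinates from the information section below it lands on the ~969-mile figure quoted above.

```python
import math

# WGS-84 ellipsoid constants
A = 6378137.0            # semi-major axis, meters
F = 1 / 298.257223563    # flattening
B = (1 - F) * A          # semi-minor axis, meters

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula: ellipsoidal distance in statute miles."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate on the longitude on the auxiliary sphere
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    cA = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    cB = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = cB * sin_sigma * (cos_2sm + cB / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - cB / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return B * cA * (sigma - d_sigma) / 1609.344  # meters -> statute miles

# KKJ and HLH coordinates, from the airport information section below
print(round(vincenty_miles(33.845833, 131.035000, 46.082778, 122.016944), 3))
# prints ~969.075, matching the Vincenty figure above
```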

Haversine formula
  • 969.558 miles
  • 1560.352 kilometers
  • 842.523 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
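
A minimal Python sketch of the haversine formula, assuming the conventional mean Earth radius of 6371 km (a slightly different radius will shift the result by a few tenths of a kilometer):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius; other conventions differ slightly

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on a spherical Earth, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# Kitakyushu (KKJ) and Ulanhot (HLH), from the coordinates listed below
km = haversine_km(33.845833, 131.035000, 46.082778, 122.016944)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} NM")
# matches the haversine figures above to within rounding
```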

How long does it take to fly from Kitakyushu to Ulanhot?

The estimated flight time from Kitakyushu Airport to Ulanhot Yilelite Airport is 2 hours and 20 minutes.
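
The calculator's exact timing model isn't published. A common back-of-the-envelope approach, sketched below with both constants as illustrative assumptions, is an average cruise speed plus a fixed allowance for taxi, climb, and descent; with ~500 mph and a 30-minute allowance it lands within a few minutes of the figure above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500.0, buffer_hours=0.5):
    """Crude flight-time estimate: cruise leg plus a fixed ground/climb buffer.

    cruise_mph and buffer_hours are illustrative assumptions, not the site's
    published model; adjust them to calibrate against known schedules.
    """
    hours = distance_miles / cruise_mph + buffer_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} h {m:02d} min"

print(estimated_flight_time(969))  # ~2 h 26 min, near the 2 h 20 min quoted above
```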

Flight carbon footprint between Kitakyushu Airport (KKJ) and Ulanhot Yilelite Airport (HLH)

On average, flying from Kitakyushu to Ulanhot generates about 149 kg of CO2 per passenger, which is equivalent to 328 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
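
As a rough sketch, the per-passenger figure behaves like distance times an emission factor. The ~0.0955 kg per passenger-kilometer below is simply the factor implied by 149 kg over 1560 km, not a published constant; real factors vary by aircraft type and load.

```python
KG_PER_PASSENGER_KM = 0.0955  # implied by 149 kg / 1560 km; illustrative only
LBS_PER_KG = 2.20462

def co2_estimate_kg(distance_km, factor=KG_PER_PASSENGER_KM):
    """Per-passenger CO2 estimate from jet-fuel burn, in kilograms."""
    return distance_km * factor

kg = co2_estimate_kg(1560)
print(f"{kg:.0f} kg CO2 = {kg * LBS_PER_KG:.0f} lbs")  # ~149 kg = ~328 lbs
```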

Map of flight path and driving directions from Kitakyushu to Ulanhot

See the map of the shortest flight path between Kitakyushu Airport (KKJ) and Ulanhot Yilelite Airport (HLH).

Airport information

Origin: Kitakyushu Airport
City: Kitakyushu
Country: Japan
IATA Code: KKJ
ICAO Code: RJFR
Coordinates: 33°50′45″N, 131°2′6″E
Destination: Ulanhot Yilelite Airport
City: Ulanhot
Country: China
IATA Code: HLH
ICAO Code: ZBUL
Coordinates: 46°4′58″N, 122°1′1″E
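
To feed these coordinates into the distance formulas above, they must first be converted from degrees/minutes/seconds to decimal degrees. A small Python helper, assuming the exact °, ′, ″ characters used on this page:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 33°50′45″N to signed decimal degrees."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(dms_to_decimal("33°50′45″N"))  # 33.845833... (KKJ latitude)
print(dms_to_decimal("122°1′1″E"))   # 122.016944... (HLH longitude)
```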