
How far is Ulanhot from Jeju?

The distance between Jeju (Jeju International Airport) and Ulanhot (Ulanhot Yilelite Airport) is 899 miles / 1447 kilometers / 781 nautical miles.

The driving distance from Jeju (CJU) to Ulanhot (HLH) is 1125 miles / 1811 kilometers, and travel time by car is about 21 hours 33 minutes.

Jeju International Airport – Ulanhot Yilelite Airport

899 miles / 1447 kilometers / 781 nautical miles


Distance from Jeju to Ulanhot

There are several ways to calculate the distance from Jeju to Ulanhot. Here are two standard methods:

Vincenty's formula (applied above)
  • 899.061 miles
  • 1446.899 kilometers
  • 781.263 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
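The ellipsoidal calculation can be sketched as follows. This is a minimal implementation of the standard published Vincenty inverse iteration on the WGS-84 ellipsoid, not this site's exact code; the airport coordinates are the ones listed in the airport information below, converted to decimal degrees.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, iterations=200, tol=1e-12):
    """Ellipsoidal distance in kilometers (WGS-84, Vincenty inverse)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(iterations):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# Jeju (CJU): 33°30′40″N, 126°29′34″E; Ulanhot (HLH): 46°4′58″N, 122°1′1″E
d = vincenty_km(33.5111, 126.4928, 46.0828, 122.0169)
```

For these coordinates the iteration converges to roughly 1447 km, matching the figure above. (This minimal version omits the special handling Vincenty's method needs for nearly antipodal points, which does not arise here.)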

Haversine formula
  • 900.139 miles
  • 1448.634 kilometers
  • 782.200 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
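The spherical calculation is much shorter. This sketch uses a mean Earth radius of 6371 km and the same airport coordinates; the exact kilometer figure depends slightly on which radius is chosen.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(33.5111, 126.4928, 46.0828, 122.0169)
miles = km / 1.609344       # statute miles
nautical = km / 1.852       # nautical miles
```

This yields roughly 1448 km, within a couple of kilometers of the ellipsoidal Vincenty result, as expected for a route of this length.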

How long does it take to fly from Jeju to Ulanhot?

The estimated flight time from Jeju International Airport to Ulanhot Yilelite Airport is 2 hours and 12 minutes.
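The exact model behind this estimate is not shown, but a common rule of thumb divides the distance by a typical cruise speed of about 500 mph and adds roughly 30 minutes for taxi, climb, and descent. Both constants below are assumptions, not values from this page.

```python
CRUISE_MPH = 500        # assumed average cruise speed
OVERHEAD_MIN = 30       # assumed taxi/climb/descent allowance

def estimate_minutes(distance_miles):
    """Rule-of-thumb flight time in minutes."""
    return distance_miles / CRUISE_MPH * 60 + OVERHEAD_MIN

minutes = estimate_minutes(899)   # distance from this page
```

For 899 miles this gives about 2 hours 18 minutes, in the same ballpark as the 2 hours 12 minutes quoted above.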

Flight carbon footprint between Jeju International Airport (CJU) and Ulanhot Yilelite Airport (HLH)

On average, flying from Jeju to Ulanhot generates about 143 kg (roughly 315 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
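The unit conversion behind the pound figure is straightforward (1 kg = 2.20462 lb):

```python
KG_TO_LB = 2.20462            # pounds per kilogram

co2_kg = 143                  # per-passenger estimate from this page
co2_lb = co2_kg * KG_TO_LB    # about 315 lb
```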

Map of flight path and driving directions from Jeju to Ulanhot

See the map of the shortest flight path between Jeju International Airport (CJU) and Ulanhot Yilelite Airport (HLH).

Airport information

Origin Jeju International Airport
City: Jeju
Country: South Korea
IATA Code: CJU
ICAO Code: RKPC
Coordinates: 33°30′40″N, 126°29′34″E
Destination Ulanhot Yilelite Airport
City: Ulanhot
Country: China
IATA Code: HLH
ICAO Code: ZBUL
Coordinates: 46°4′58″N, 122°1′1″E