
How far is Ube from Harbin?

The distance between Harbin (Harbin Taiping International Airport) and Ube (Yamaguchi Ube Airport) is 850 miles / 1367 kilometers / 738 nautical miles.

The driving distance from Harbin (HRB) to Ube (UBJ) is 1216 miles / 1957 kilometers, and travel time by car is about 26 hours 40 minutes.

Harbin Taiping International Airport – Yamaguchi Ube Airport

850 miles / 1367 kilometers / 738 nautical miles


Distance from Harbin to Ube

There are several ways to calculate the distance from Harbin to Ube. Here are two standard methods:

Vincenty's formula (applied above)
  • 849.541 miles
  • 1367.204 kilometers
  • 738.231 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
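For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The ellipsoid constants, convergence tolerance, and iteration cap are standard textbook values chosen for the illustration; this is not necessarily the exact implementation this calculator uses.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Geodesic distance on the WGS-84 ellipsoid via Vincenty's inverse formula."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (meters)

    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until it converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial-line special case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344  # meters to statute miles

# HRB and UBJ coordinates in decimal degrees (converted from the airport data below)
print(vincenty_miles(45.6233, 126.25, 33.93, 131.2789))  # ~849.5 miles
```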

Haversine formula
  • 850.467 miles
  • 1368.693 kilometers
  • 739.035 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
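A corresponding haversine sketch, again in Python. The mean Earth radius of 3,958.8 miles and the decimal-degree coordinates (converted from the airport data below) are assumptions of the example, not values published by this page:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, assuming a spherical Earth."""
    EARTH_RADIUS_MILES = 3958.8  # mean radius; a given calculator may use a slightly different value
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

# HRB to UBJ
print(haversine_miles(45.6233, 126.25, 33.93, 131.2789))  # ~850 miles
```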

How long does it take to fly from Harbin to Ube?

The estimated flight time from Harbin Taiping International Airport to Yamaguchi Ube Airport is 2 hours and 6 minutes.
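Estimates like this are typically the great-circle distance divided by an average block speed, plus a fixed allowance for take-off and landing. The sketch below uses an assumed 500 mph cruise speed and a 30-minute allowance; with those particular assumptions it yields about 2 h 12 min, so this page's 2 h 6 min evidently rests on slightly different parameters.

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Crude block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.
    Both parameters are illustrative assumptions, not this site's exact inputs."""
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = estimated_flight_minutes(850)
print(f"{int(minutes // 60)} h {int(minutes % 60)} min")  # ~2 h 12 min under these assumptions
```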

Flight carbon footprint between Harbin Taiping International Airport (HRB) and Yamaguchi Ube Airport (UBJ)

On average, flying from Harbin to Ube generates about 139 kg of CO2 per passenger, which is roughly 306 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
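Per-passenger figures of this kind are usually the product of distance and an average emission factor. In the sketch below the factor is simply back-derived from this page's own numbers (139 kg over 850 miles, about 0.16 kg per passenger-mile) and is illustrative only:

```python
KG_CO2_PER_PASSENGER_MILE = 139 / 850  # back-derived from this page's figures; illustrative only
KG_PER_POUND = 0.45359237

def co2_kg(distance_miles):
    """Per-passenger CO2 estimate from jet-fuel burn, given an average emission factor."""
    return distance_miles * KG_CO2_PER_PASSENGER_MILE

kg = co2_kg(850)
print(round(kg), "kg ≈", round(kg / KG_PER_POUND), "lbs")  # 139 kg ≈ 306 lbs
```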

Map of flight path and driving directions from Harbin to Ube

See the map of the shortest flight path between Harbin Taiping International Airport (HRB) and Yamaguchi Ube Airport (UBJ).

Airport information

Origin Harbin Taiping International Airport
City: Harbin
Country: China
IATA Code: HRB
ICAO Code: ZYHB
Coordinates: 45°37′24″N, 126°15′0″E
Destination Yamaguchi Ube Airport
City: Ube
Country: Japan
IATA Code: UBJ
ICAO Code: RJDC
Coordinates: 33°55′48″N, 131°16′44″E
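The coordinates above are given in degrees-minutes-seconds, while the distance formulas earlier on the page work in decimal degrees. A small conversion sketch (the helper name is mine):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# HRB: 45°37′24″N, 126°15′0″E  ->  (45.6233, 126.25)
print(dms_to_decimal(45, 37, 24, "N"), dms_to_decimal(126, 15, 0, "E"))
# UBJ: 33°55′48″N, 131°16′44″E  ->  (33.93, 131.2789)
print(dms_to_decimal(33, 55, 48, "N"), dms_to_decimal(131, 16, 44, "E"))
```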