
How far is Jixi from Ube?

The distance between Ube (Yamaguchi Ube Airport) and Jixi (Jixi Xingkaihu Airport) is 784 miles / 1262 kilometers / 681 nautical miles.

The driving distance from Ube (UBJ) to Jixi (JXA) is 1351 miles / 2174 kilometers, and travel time by car is about 28 hours 59 minutes.

Yamaguchi Ube Airport – Jixi Xingkaihu Airport

  • 784 miles
  • 1262 kilometers
  • 681 nautical miles


Distance from Ube to Jixi

There are several ways to calculate the distance from Ube to Jixi. Here are two standard methods:

Vincenty's formula (applied above)
  • 783.941 miles
  • 1261.631 kilometers
  • 681.226 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
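For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport listings further down the page; the tolerance and iteration cap are ordinary defaults, not the calculator's own settings.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in metres between two points on the WGS-84 ellipsoid
    using Vincenty's inverse formula."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L

    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)       # metres

# UBJ and JXA in decimal degrees (converted from the airport info below)
metres = vincenty_inverse(33.9300, 131.2789, 45.2928, 131.1928)
print(f"{metres / 1609.344:.3f} mi / {metres / 1000:.3f} km / {metres / 1852:.3f} NM")
# ≈ 784 mi / 1262 km / 681 NM
```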

Haversine formula
  • 785.122 miles
  • 1263.531 kilometers
  • 682.252 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
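The haversine result can be reproduced in a few lines. The 6371 km mean Earth radius used here is a common convention; a different radius shifts the answer slightly.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine(33.9300, 131.2789, 45.2928, 131.1928)
print(f"{km / 1.609344:.3f} mi / {km:.3f} km / {km / 1.852:.3f} NM")
# ≈ 785 mi / 1264 km / 682 NM
```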

How long does it take to fly from Ube to Jixi?

The estimated flight time from Yamaguchi Ube Airport to Jixi Xingkaihu Airport is 1 hour and 59 minutes.
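The page does not state how the flight time is derived. A common rough approach, sketched below, assumes an average cruise speed plus a fixed allowance for taxi, climb and descent; with these assumed constants it lands near the quoted figure but not exactly on it.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: average cruise speed plus a fixed
    allowance for taxi, climb and descent. Both constants are assumptions,
    not the calculator's published method."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(784))
# about 2 h 4 min with these assumed constants (the page quotes 1 h 59 min)
```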

What is the time difference between Ube and Jixi?

The time difference between Ube and Jixi is 1 hour. Jixi is 1 hour behind Ube.
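Since Japan (UTC+9) and China (UTC+8) each keep a single year-round offset, the difference can be confirmed from the IANA time-zone database, for example:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Ube observes Japan Standard Time, Jixi observes China Standard Time;
# neither zone uses daylight saving time, so any reference instant works.
when = datetime(2024, 6, 1, 12, 0)
ube_offset = when.replace(tzinfo=ZoneInfo("Asia/Tokyo")).utcoffset()
jixi_offset = when.replace(tzinfo=ZoneInfo("Asia/Shanghai")).utcoffset()

diff_hours = (ube_offset - jixi_offset).total_seconds() / 3600
print(f"Jixi is {diff_hours:.0f} hour(s) behind Ube")   # 1 hour
```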

Flight carbon footprint between Yamaguchi Ube Airport (UBJ) and Jixi Xingkaihu Airport (JXA)

On average, flying from Ube to Jixi generates about 133 kg of CO2 per passenger, and 133 kilograms equals 293 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
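The calculator's emission model is not published. The sketch below uses an assumed flat factor of about 0.105 kg of CO2 per passenger-kilometre, which happens to land near the quoted 133 kg; real figures depend on aircraft type, load factor and routing.

```python
def co2_per_passenger_kg(distance_km, kg_co2_per_pax_km=0.105):
    """Rough per-passenger CO2 for a short-haul flight using a flat
    emission factor (kg CO2 per passenger-kilometre). The factor is an
    assumption, not the calculator's own model."""
    return distance_km * kg_co2_per_pax_km

kg = co2_per_passenger_kg(1262)
print(f"about {kg:.0f} kg CO2 per passenger ({kg * 2.20462:.0f} lbs)")
# close to the ~133 kg quoted above
```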

Map of flight path and driving directions from Ube to Jixi

See the map of the shortest flight path between Yamaguchi Ube Airport (UBJ) and Jixi Xingkaihu Airport (JXA).

Airport information

Origin: Yamaguchi Ube Airport
City: Ube
Country: Japan
IATA Code: UBJ
ICAO Code: RJDC
Coordinates: 33°55′48″N, 131°16′44″E
Destination: Jixi Xingkaihu Airport
City: Jixi
Country: China
IATA Code: JXA
ICAO Code: ZYJX
Coordinates: 45°17′34″N, 131°11′34″E
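The coordinates above are listed in degrees, minutes and seconds, while the distance formulas earlier on the page take decimal degrees. A minimal converter, assuming the exact ° ′ ″ formatting shown above:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 33°55′48″N to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("33°55′48″N"), dms_to_decimal("131°16′44″E"))  # Ube (UBJ)
print(dms_to_decimal("45°17′34″N"), dms_to_decimal("131°11′34″E"))  # Jixi (JXA)
```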