How far is Ube from Kuala Lumpur?

The distance between Kuala Lumpur (Kuala Lumpur International Airport) and Ube (Yamaguchi Ube Airport) is 2870 miles / 4620 kilometers / 2494 nautical miles.

The driving distance from Kuala Lumpur (KUL) to Ube (UBJ) is 4711 miles / 7582 kilometers, and travel time by car is about 92 hours 7 minutes.

Kuala Lumpur International Airport – Yamaguchi Ube Airport

2870 Miles
4620 Kilometers
2494 Nautical miles

Distance from Kuala Lumpur to Ube

There are several ways to calculate the distance from Kuala Lumpur to Ube. Here are two standard methods:

Vincenty's formula (applied above)
  • 2870.460 miles
  • 4619.557 kilometers
  • 2494.361 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
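For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed in the airport information section further down. The coordinates and unit conversions come from this page; the function name, iteration limit, and convergence tolerance are illustrative choices, not the calculator's actual code.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = 0.0 if cos_sq_alpha == 0 else \
            cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sigma_m ** 2) -
        B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# KUL and UBJ coordinates (decimal degrees) from the airport information below
metres = vincenty_distance(2.7456, 101.7097, 33.9300, 131.2789)
print(f"{metres / 1000:.3f} km")       # should land near 4619.6 km
print(f"{metres / 1609.344:.3f} mi")   # should land near 2870.5 miles
print(f"{metres / 1852:.3f} NM")       # should land near 2494.4 nautical miles
```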

Haversine formula
  • 2875.658 miles
  • 4627.923 kilometers
  • 2498.878 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
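A corresponding sketch of the haversine formula follows. The mean Earth radius of 6371 km is a common convention and an assumption here; the page does not state which radius it uses, so the result may differ slightly from the figure above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_distance(2.7456, 101.7097, 33.9300, 131.2789)
print(f"{km:.1f} km, {km / 1.609344:.1f} miles")   # roughly 4628 km / 2876 miles
```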

How long does it take to fly from Kuala Lumpur to Ube?

The estimated flight time from Kuala Lumpur International Airport to Yamaguchi Ube Airport is 5 hours and 56 minutes.
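The page does not publish its flight-time model, but a rough back-of-the-envelope estimate divides the flight distance by an assumed average block speed. The 484 mph figure below is purely an assumption chosen to illustrate the calculation; it happens to reproduce roughly 5 hours 56 minutes for the Vincenty distance above.

```python
def estimate_flight_time(distance_miles, avg_speed_mph=484.0):
    """Very rough block-time estimate: distance / assumed average speed.

    avg_speed_mph is an assumption (typical jet block averages, including
    climb and descent, fall somewhere around 450-500 mph); this is not the
    calculator's published model.
    """
    total_minutes = round(distance_miles / avg_speed_mph * 60)
    return divmod(total_minutes, 60)

hours, minutes = estimate_flight_time(2870.460)
print(f"{hours} h {minutes} min")   # about 5 h 56 min with the assumed speed
```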

Flight carbon footprint between Kuala Lumpur International Airport (KUL) and Yamaguchi Ube Airport (UBJ)

On average, flying from Kuala Lumpur to Ube generates about 319 kg of CO2 per passenger, which is equivalent to 703 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
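The kilogram-to-pound conversion quoted above can be checked with one line of arithmetic; the emission estimate itself comes from the page, and only the unit conversion is shown here.

```python
KG_PER_LB = 0.45359237             # exact definition of the avoirdupois pound

co2_kg = 319                       # per-passenger estimate quoted above
print(round(co2_kg / KG_PER_LB))   # 703 lbs, matching the figure above
```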

Map of flight path and driving directions from Kuala Lumpur to Ube

See the map of the shortest flight path between Kuala Lumpur International Airport (KUL) and Yamaguchi Ube Airport (UBJ).
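If you want to draw that shortest (great-circle) path yourself, one way is to interpolate intermediate waypoints on a sphere, as in the sketch below. The spherical-interpolation approach and the number of sample points are illustrative choices, not necessarily how the site renders its map.

```python
import math

def great_circle_points(lat1, lon1, lat2, lon2, n=50):
    """Sample n+1 points along the great circle via spherical interpolation."""
    def to_xyz(lat, lon):
        lat, lon = math.radians(lat), math.radians(lon)
        return (math.cos(lat) * math.cos(lon),
                math.cos(lat) * math.sin(lon),
                math.sin(lat))

    p, q = to_xyz(lat1, lon1), to_xyz(lat2, lon2)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(p, q))))
    omega = math.acos(dot)                    # central angle between the points

    points = []
    for i in range(n + 1):
        t = i / n
        w1 = math.sin((1 - t) * omega) / math.sin(omega)
        w2 = math.sin(t * omega) / math.sin(omega)
        x, y, z = (w1 * a + w2 * b for a, b in zip(p, q))
        points.append((math.degrees(math.asin(z)), math.degrees(math.atan2(y, x))))
    return points

# Waypoints from KUL to UBJ (lat, lon in degrees), ready to plot on a map
path = great_circle_points(2.7456, 101.7097, 33.9300, 131.2789)
```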

Airport information

Origin: Kuala Lumpur International Airport
City: Kuala Lumpur
Country: Malaysia
IATA Code: KUL
ICAO Code: WMKK
Coordinates: 2°44′44″N, 101°42′35″E
Destination: Yamaguchi Ube Airport
City: Ube
Country: Japan
IATA Code: UBJ
ICAO Code: RJDC
Coordinates: 33°55′48″N, 131°16′44″E