
How far is Tokyo from Ube?

The distance between Ube (Yamaguchi Ube Airport) and Tokyo (Narita International Airport) is 533 miles / 857 kilometers / 463 nautical miles.

The driving distance from Ube (UBJ) to Tokyo (NRT) is 641 miles / 1032 kilometers, and travel time by car is about 12 hours 37 minutes.

Yamaguchi Ube Airport – Narita International Airport

533 miles
857 kilometers
463 nautical miles


Distance from Ube to Tokyo

There are several ways to calculate the distance from Ube to Tokyo. Here are two standard methods:

Vincenty's formula (applied above)
  • 532.544 miles
  • 857.047 kilometers
  • 462.768 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
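For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed in the airport information below. The function name and convergence tolerance are illustrative choices, not this site's exact implementation.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in meters via Vincenty's inverse formula on the WGS-84 ellipsoid."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); the guard handles points on the equator
        cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2)
        - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# UBJ and NRT coordinates, converted from the DMS values in the airport information below
ubj = (33.93, 131.2789)     # 33°55′48″N, 131°16′44″E
nrt = (35.7644, 140.3858)   # 35°45′52″N, 140°23′9″E
meters = vincenty_distance(*ubj, *nrt)
print(meters / 1609.344, "miles")   # ≈ 532.5, matching the Vincenty figure above
```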

Haversine formula
  • 531.504 miles
  • 855.373 kilometers
  • 461.864 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
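A matching Python sketch of the haversine formula, assuming the commonly used mean Earth radius of 6,371 km (the radius behind the figures above is not stated, so this is an assumption):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same UBJ and NRT coordinates as in the Vincenty example
km = haversine_distance(33.93, 131.2789, 35.7644, 140.3858)
print(km)              # ≈ 855.4 km, matching the haversine figure above
print(km / 1.609344)   # ≈ 531.5 miles
```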

How long does it take to fly from Ube to Tokyo?

The estimated flight time from Yamaguchi Ube Airport to Narita International Airport is 1 hour and 30 minutes.
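The site does not state how this estimate is derived. A common rule of thumb, shown here purely as an illustration, adds roughly 30 minutes of taxi, climb, and descent to the cruise time at an assumed average speed of about 500 mph; both numbers are assumptions, not this site's method.

```python
distance_miles = 532.544   # Vincenty distance from above
cruise_mph = 500           # assumed average cruise speed
overhead_min = 30          # assumed taxi/climb/descent allowance

total_min = overhead_min + distance_miles / cruise_mph * 60
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # ≈ 1 h 34 min
```

The result lands close to the 1 hour 30 minutes quoted above.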

What is the time difference between Ube and Tokyo?

There is no time difference between Ube and Tokyo.

Flight carbon footprint between Yamaguchi Ube Airport (UBJ) and Narita International Airport (NRT)

On average, flying from Ube to Tokyo generates about 103 kg (roughly 228 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
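The page does not publish its emissions model. A simple back-of-the-envelope sketch multiplies distance by an assumed per-passenger fuel burn (chosen here only for illustration; real values vary by aircraft and load factor) and by the roughly 3.16 kg of CO2 released per kilogram of jet fuel burned:

```python
distance_km = 857.047      # Vincenty distance from above
fuel_per_pax_km = 0.038    # assumed fuel burn per passenger-km (kg); illustrative only
co2_per_kg_fuel = 3.16     # kg of CO2 released per kg of jet fuel burned

co2_kg = distance_km * fuel_per_pax_km * co2_per_kg_fuel
print(round(co2_kg), "kg CO2 per passenger")   # ≈ 103 kg
```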

Map of flight path and driving directions from Ube to Tokyo

See the map of the shortest flight path between Yamaguchi Ube Airport (UBJ) and Narita International Airport (NRT).

Airport information

Origin: Yamaguchi Ube Airport
City: Ube
Country: Japan
IATA Code: UBJ
ICAO Code: RJDC
Coordinates: 33°55′48″N, 131°16′44″E
Destination: Narita International Airport
City: Tokyo
Country: Japan
IATA Code: NRT
ICAO Code: RJAA
Coordinates: 35°45′52″N, 140°23′9″E