
How far is Ube from San Juan?

The distance between San Juan (San Juan Luis Muñoz Marín International Airport) and Ube (Yamaguchi Ube Airport) is 8654 miles / 13928 kilometers / 7521 nautical miles.

San Juan Luis Muñoz Marín International Airport – Yamaguchi Ube Airport

Distance: 8654 miles / 13928 kilometers / 7521 nautical miles
Flight time: 16 h 53 min
CO2 emission: 1 095 kg


Distance from San Juan to Ube

There are several ways to calculate the distance from San Juan to Ube. Here are two standard methods:

Vincenty's formula (applied above)
  • 8654.450 miles
  • 13927.987 kilometers
  • 7520.511 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
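As a rough sketch, the standard Vincenty inverse solution on the WGS-84 ellipsoid can be implemented as follows (the iteration tolerance and iteration cap are implementation choices, not values from this page):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns km."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
            + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# SJU (18°26′21″N, 66°0′6″W) to UBJ (33°55′48″N, 131°16′44″E)
sju = (18 + 26/60 + 21/3600, -(66 + 0/60 + 6/3600))
ubj = (33 + 55/60 + 48/3600, 131 + 16/60 + 44/3600)
print(round(vincenty_km(*sju, *ubj), 1))   # ≈ 13928 km
```

Because the iteration converges on an ellipsoidal geodesic rather than a great circle, the result differs slightly from the haversine figure below.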

Haversine formula
  • 8643.744 miles
  • 13910.758 kilometers
  • 7511.209 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
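A minimal haversine sketch, assuming the commonly used mean Earth radius of 6371 km (a different radius choice shifts the result proportionally):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# SJU (18°26′21″N, 66°0′6″W) to UBJ (33°55′48″N, 131°16′44″E)
sju = (18 + 26/60 + 21/3600, -(66 + 0/60 + 6/3600))
ubj = (33 + 55/60 + 48/3600, 131 + 16/60 + 44/3600)
print(round(haversine_km(*sju, *ubj), 1))   # ≈ 13911 km
```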

How long does it take to fly from San Juan to Ube?

The estimated flight time from San Juan Luis Muñoz Marín International Airport to Yamaguchi Ube Airport is 16 hours and 53 minutes.
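The listed 16 h 53 min over 8654 miles implies an average block speed of roughly 513 mph. A simple estimator of this kind (the cruise speed and fixed taxi/climb overhead below are assumed values, not the site's actual model) might look like:

```python
def flight_time(distance_mi, cruise_mph=500.0, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed overhead
    for taxi, climb and descent (assumed values)."""
    total_min = distance_mi / cruise_mph * 60 + overhead_min
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m} min"

print(flight_time(8654))
```

With these assumptions the estimate comes out somewhat longer than the page's figure; the exact output depends entirely on the speed and overhead chosen.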

Flight carbon footprint between San Juan Luis Muñoz Marín International Airport (SJU) and Yamaguchi Ube Airport (UBJ)

On average, flying from San Juan to Ube generates about 1 095 kg of CO2 per passenger, which is roughly 2 414 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
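The kilogram-to-pound conversion uses the exact definition of the pound; converting an already-rounded kilogram figure can shift the pound value by one, which is one sketch of where small discrepancies in published numbers come from:

```python
KG_PER_LB = 0.45359237          # exact definition of the avoirdupois pound

def kg_to_lbs(kg):
    """Convert a mass in kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lbs(1095)))   # 2414
```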

Map of flight path from San Juan to Ube

See the map of the shortest flight path between San Juan Luis Muñoz Marín International Airport (SJU) and Yamaguchi Ube Airport (UBJ).

Airport information

Origin: San Juan Luis Muñoz Marín International Airport
City: San Juan
Country: Puerto Rico
IATA Code: SJU
ICAO Code: TJSJ
Coordinates: 18°26′21″N, 66°0′6″W
Destination: Yamaguchi Ube Airport
City: Ube
Country: Japan
IATA Code: UBJ
ICAO Code: RJDC
Coordinates: 33°55′48″N, 131°16′44″E