How far is Rosh Pina from Ube?
The distance between Ube (Yamaguchi Ube Airport) and Rosh Pina (Rosh Pina Airport) is 5292 miles / 8516 kilometers / 4599 nautical miles.
The driving distance from Ube (UBJ) to Rosh Pina (RPN) is 6609 miles / 10636 kilometers, and travel time by car is about 130 hours 43 minutes.
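The three straight-line figures above are fixed-ratio conversions of the same distance, so only one of them is independent. A minimal conversion sketch (the constants are standard unit definitions; the variable names are ours):

```python
# Statute miles, kilometres and nautical miles as conversions of one distance.
MILES_TO_KM = 1.609344        # exact definition of the statute mile
MILES_TO_NM = 1.0 / 1.150779  # 1 nautical mile ≈ 1.150779 statute miles

distance_mi = 5292            # rounded figure quoted above
print(f"{distance_mi * MILES_TO_KM:.0f} km")   # ≈ 8517 km (8516 km from the unrounded value)
print(f"{distance_mi * MILES_TO_NM:.0f} NM")   # ≈ 4599 NM
```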
Yamaguchi Ube Airport – Rosh Pina Airport
Distance from Ube to Rosh Pina
There are several ways to calculate the distance from Ube to Rosh Pina. Here are two standard methods:
Vincenty's formula (applied above)
- 5291.870 miles
- 8516.440 kilometers
- 4598.510 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
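As a sketch of the ellipsoidal calculation, the snippet below uses the geographiclib package (Karney's geodesic algorithm rather than Vincenty's iteration, but both solve the same inverse problem on the WGS-84 ellipsoid and agree to well under a metre). The coordinates are the ones listed in the airport information table at the bottom of the page, converted to decimal degrees.

```python
from geographiclib.geodesic import Geodesic  # pip install geographiclib

# Airport coordinates in decimal degrees (from the airport information table below)
UBJ = (33.9300, 131.2789)   # Yamaguchi Ube Airport
RPN = (32.9808, 35.5717)    # Rosh Pina Airport

# Inverse geodesic problem on the WGS-84 ellipsoid; 's12' is the distance in metres
result = Geodesic.WGS84.Inverse(UBJ[0], UBJ[1], RPN[0], RPN[1])
km = result["s12"] / 1000.0
print(f"{km:.1f} km ≈ {km / 1.609344:.1f} mi")   # ≈ 8516 km / 5292 mi
```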
Haversine formula
- 5280.792 miles
- 8498.611 kilometers
- 4588.883 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
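A minimal haversine sketch in Python, assuming the usual 6371 km mean Earth radius and the same airport coordinates as above:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere of the given radius."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlon / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(33.9300, 131.2789, 32.9808, 35.5717)   # UBJ -> RPN
print(f"{km:.1f} km ≈ {km / 1.609344:.1f} mi")           # ≈ 8499 km / 5281 mi
```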
How long does it take to fly from Ube to Rosh Pina?
The estimated flight time from Yamaguchi Ube Airport to Rosh Pina Airport is 10 hours and 31 minutes.
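The flight time shown here is derived from the great-circle distance and a typical cruising speed; the exact speed and taxi/climb allowances behind the estimate aren't stated, so the constants below are assumptions chosen only to illustrate the arithmetic.

```python
# Rough flight-time estimate: distance divided by an assumed average block speed.
# 503 mph is an assumption picked to reproduce the 10 h 31 min figure above;
# real estimates also add fixed time for taxi, climb and descent.
DISTANCE_MI = 5292
AVG_SPEED_MPH = 503            # assumed average speed (not from the source)

hours = DISTANCE_MI / AVG_SPEED_MPH
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"about {h} h {m} min")  # -> about 10 h 31 min
```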
What is the time difference between Ube and Rosh Pina?
The time difference between Ube and Rosh Pina is 7 hours. Rosh Pina is 7 hours behind Ube.
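A quick check of the offset with Python's standard zoneinfo module. The 7-hour gap corresponds to Israel Standard Time; Japan observes no daylight saving, so during Israeli summer time the gap narrows to 6 hours.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

when = datetime(2024, 1, 15, 12, 0, tzinfo=ZoneInfo("UTC"))                  # any winter date
ube_offset = when.astimezone(ZoneInfo("Asia/Tokyo")).utcoffset()             # UTC+9
rosh_pina_offset = when.astimezone(ZoneInfo("Asia/Jerusalem")).utcoffset()   # UTC+2 (standard time)

diff = (ube_offset - rosh_pina_offset).total_seconds() / 3600
print(f"Rosh Pina is {diff:.0f} hours behind Ube")   # -> 7
```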
Flight carbon footprint between Yamaguchi Ube Airport (UBJ) and Rosh Pina Airport (RPN)
On average, flying from Ube to Rosh Pina generates about 622 kg of CO2 per passenger (622 kilograms equals 1,371 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
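A back-of-the-envelope version of that estimate. The per-passenger-kilometre factor below is an assumption chosen only so the arithmetic lands near 622 kg; real calculators use aircraft type, load factor and fuel-burn data.

```python
# CO2 estimate scaled from the great-circle distance, plus the kg -> lbs conversion.
KG_CO2_PER_PAX_KM = 0.073      # assumed illustrative emission factor (not from the source)
DISTANCE_KM = 8516
LBS_PER_KG = 2.20462

co2_kg = KG_CO2_PER_PAX_KM * DISTANCE_KM
print(f"{co2_kg:.0f} kg CO2 per passenger ≈ {co2_kg * LBS_PER_KG:.0f} lbs")
# -> 622 kg ≈ 1371 lbs
```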
Map of flight path and driving directions from Ube to Rosh Pina
See the map of the shortest flight path between Yamaguchi Ube Airport (UBJ) and Rosh Pina Airport (RPN).
Airport information
| Origin | Yamaguchi Ube Airport |
|---|---|
| City: | Ube |
| Country: | Japan |
| IATA Code: | UBJ |
| ICAO Code: | RJDC |
| Coordinates: | 33°55′48″N, 131°16′44″E |
| Destination | Rosh Pina Airport |
|---|---|
| City: | Rosh Pina |
| Country: | Israel |
| IATA Code: | RPN |
| ICAO Code: | LLIB |
| Coordinates: | 32°58′51″N, 35°34′18″E |
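The coordinates in both tables are given in degrees, minutes and seconds; to feed them into the distance formulas above they need to be converted to decimal degrees. A small sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Yamaguchi Ube Airport: 33°55′48″N, 131°16′44″E
print(dms_to_decimal(33, 55, 48, "N"), dms_to_decimal(131, 16, 44, "E"))   # ≈ 33.9300, 131.2789
# Rosh Pina Airport: 32°58′51″N, 35°34′18″E
print(dms_to_decimal(32, 58, 51, "N"), dms_to_decimal(35, 34, 18, "E"))    # ≈ 32.9808, 35.5717
```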