
How far is Rosh Pina from Bisha?

The distance between Bisha (Bisha Domestic Airport) and Rosh Pina (Rosh Pina Airport) is 995 miles / 1601 kilometers / 865 nautical miles.

The driving distance from Bisha (BHH) to Rosh Pina (RPN) is 1412 miles / 2273 kilometers, and travel time by car is about 25 hours 13 minutes.

Bisha Domestic Airport – Rosh Pina Airport: 995 miles / 1601 kilometers / 865 nautical miles


Distance from Bisha to Rosh Pina

There are several ways to calculate the distance from Bisha to Rosh Pina. Here are two standard methods:

Vincenty's formula (applied above)
  • 994.919 miles
  • 1601.167 kilometers
  • 864.561 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
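The Vincenty figure above can be reproduced with the standard iterative inverse formula on the WGS-84 ellipsoid. The sketch below is one common formulation (Vincenty's 1975 inverse method); the decimal coordinates are converted from the DMS values listed under "Airport information".

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# BHH (19°59′3″N, 42°37′15″E) to RPN (32°58′51″N, 35°34′18″E)
print(round(vincenty_km(19.98417, 42.62083, 32.98083, 35.57167), 3))
```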

Haversine formula
  • 997.463 miles
  • 1605.261 kilometers
  • 866.772 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
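The haversine result above follows directly from the formula, using a mean Earth radius of 6371 km and the decimal equivalents of the coordinates listed under "Airport information":

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of mean radius 6371 km."""
    R = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# BHH (19°59′3″N, 42°37′15″E) to RPN (32°58′51″N, 35°34′18″E)
print(round(haversine_km(19.98417, 42.62083, 32.98083, 35.57167), 1))
```

The result, about 1605 km, matches the haversine figure quoted above; the small difference from the Vincenty value reflects the spherical versus ellipsoidal Earth model.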

How long does it take to fly from Bisha to Rosh Pina?

The estimated flight time from Bisha Domestic Airport to Rosh Pina Airport is 2 hours and 23 minutes.
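The page does not state the model behind this estimate. A common rule of thumb is a fixed allowance for taxi, climb, and descent plus the distance flown at a typical jet cruise speed; the constants below are assumptions, and with them the sketch lands at roughly 2.5 hours, in the same ballpark as the 2 hours 23 minutes quoted above.

```python
CRUISE_MPH = 500    # assumed average cruise speed
OVERHEAD_MIN = 30   # assumed allowance for taxi, climb, and descent

def estimate_minutes(distance_miles):
    """Rule-of-thumb flight time in minutes."""
    return OVERHEAD_MIN + distance_miles / CRUISE_MPH * 60

total = estimate_minutes(995)  # great-circle distance BHH-RPN in miles
print(f"{int(total // 60)} h {round(total % 60)} min")
```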

Flight carbon footprint between Bisha Domestic Airport (BHH) and Rosh Pina Airport (RPN)

On average, flying from Bisha to Rosh Pina generates about 150 kg of CO2 per passenger; 150 kilograms equals about 331 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
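The kilogram-to-pound conversion uses the exact definition of the pound (0.45359237 kg):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 150            # estimated CO2 per passenger, BHH-RPN
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))
```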

Map of flight path and driving directions from Bisha to Rosh Pina

See the map of the shortest flight path between Bisha Domestic Airport (BHH) and Rosh Pina Airport (RPN).

Airport information

Origin Bisha Domestic Airport
City: Bisha
Country: Saudi Arabia
IATA Code: BHH
ICAO Code: OEBH
Coordinates: 19°59′3″N, 42°37′15″E
Destination Rosh Pina Airport
City: Rosh Pina
Country: Israel
IATA Code: RPN
ICAO Code: LLIB
Coordinates: 32°58′51″N, 35°34′18″E
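The coordinates above are given in degrees, minutes, and seconds (DMS); the distance formulas need decimal degrees. The conversion is straightforward:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Bisha Domestic Airport (BHH): 19°59′3″N, 42°37′15″E
print(round(dms_to_decimal(19, 59, 3, "N"), 5),
      round(dms_to_decimal(42, 37, 15, "E"), 5))
# Rosh Pina Airport (RPN): 32°58′51″N, 35°34′18″E
print(round(dms_to_decimal(32, 58, 51, "N"), 5),
      round(dms_to_decimal(35, 34, 18, "E"), 5))
```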