How far is Rosh Pina from Benghazi?

The distance between Benghazi (Benina International Airport) and Rosh Pina (Rosh Pina Airport) is 894 miles / 1440 kilometers / 777 nautical miles.

Distance from Benghazi to Rosh Pina

There are several ways to calculate the distance from Benghazi to Rosh Pina. Here are two standard methods:

Vincenty's formula (applied above)
  • 894.476 miles
  • 1439.520 kilometers
  • 777.279 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
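
To reproduce the Vincenty figure yourself, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates come from the airport information at the bottom of this page; the function itself is an illustration, not this site's actual code:

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Inverse Vincenty on the WGS-84 ellipsoid; returns metres."""
        a = 6378137.0               # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563       # WGS-84 flattening
        b = (1 - f) * a             # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0          # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                            if cos2_alpha else 0.0)   # equatorial line
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break               # converged

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                 * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma)

    # BEN -> RPN, using the coordinates listed at the bottom of this page:
    metres = vincenty_distance(32.0967, 20.2694, 32.9808, 35.5717)
    print(metres / 1000)            # ≈ 1439.5 km, matching the figure above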

Haversine formula
  • 892.631 miles
  • 1436.550 kilometers
  • 775.675 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
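
A corresponding sketch of the haversine calculation, assuming the conventional mean Earth radius of 6,371 kilometers:

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given mean radius (km)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        d_phi = math.radians(lat2 - lat1)
        d_lam = math.radians(lon2 - lon1)
        a = (math.sin(d_phi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    print(haversine_distance(32.0967, 20.2694, 32.9808, 35.5717))
    # ≈ 1436.5 km, matching the figure above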

How long does it take to fly from Benghazi to Rosh Pina?

The estimated flight time from Benina International Airport to Rosh Pina Airport is 2 hours and 11 minutes.
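
This site does not publish the parameters behind its flight-time estimate. One simple block-time model, assuming an average cruise speed of about 530 mph plus a fixed 30-minute allowance for taxi, climb, and descent (both values are assumptions chosen for illustration), roughly reproduces the figure:

    def estimated_flight_time(distance_miles, cruise_mph=530, overhead_min=30):
        # Block-time model: cruise segment plus a fixed allowance for
        # taxi, climb, and descent. Both parameters are assumptions chosen
        # to roughly reproduce the 2 h 11 min estimate shown above.
        total_min = distance_miles / cruise_mph * 60 + overhead_min
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} hours and {minutes} minutes"

    print(estimated_flight_time(894))   # -> "2 hours and 11 minutes"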

What is the time difference between Benghazi and Rosh Pina?

Benghazi (EET, UTC+2) and Rosh Pina (IST, UTC+2) share the same standard time, so there is usually no time difference. Note, however, that Israel observes daylight saving time while Libya does not, so Rosh Pina runs one hour ahead of Benghazi during Israeli summer time.

Flight carbon footprint between Benina International Airport (BEN) and Rosh Pina Airport (RPN)

On average, flying from Benghazi to Rosh Pina generates about 143 kg of CO2 per passenger, which is equivalent to about 315 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
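
As a sanity check on the arithmetic, the 143 kg figure implies an average of roughly 0.16 kg of CO2 per passenger-mile on this route; the per-mile factor below is simply back-calculated from the numbers above, not a published methodology:

    KG_PER_LB = 0.45359237                      # exact definition of the pound

    distance_miles = 894
    kg_per_mile = 143 / distance_miles          # implied factor ≈ 0.16 kg CO2/mi
    co2_kg = kg_per_mile * distance_miles       # back to ≈ 143 kg
    co2_lbs = co2_kg / KG_PER_LB                # ≈ 315.3 lbs
    print(round(kg_per_mile, 2), round(co2_lbs))   # -> 0.16 315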

Map of flight path from Benghazi to Rosh Pina

See the map of the shortest flight path between Benina International Airport (BEN) and Rosh Pina Airport (RPN).

Airport information

Origin: Benina International Airport
City: Benghazi
Country: Libya
IATA Code: BEN
ICAO Code: HLLB
Coordinates: 32°5′48″N, 20°16′10″E
Destination: Rosh Pina Airport
City: Rosh Pina
Country: Israel
IATA Code: RPN
ICAO Code: LLIB
Coordinates: 32°58′51″N, 35°34′18″E
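
The coordinates above use degrees/minutes/seconds, while the distance formulas earlier on this page take decimal degrees. A small, illustrative conversion helper:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # Convert degrees/minutes/seconds plus a hemisphere letter to
        # signed decimal degrees (south and west are negative).
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # Benina International Airport (BEN): 32°5′48″N, 20°16′10″E
    ben = (dms_to_decimal(32, 5, 48, "N"), dms_to_decimal(20, 16, 10, "E"))
    # Rosh Pina Airport (RPN): 32°58′51″N, 35°34′18″E
    rpn = (dms_to_decimal(32, 58, 51, "N"), dms_to_decimal(35, 34, 18, "E"))
    print(ben, rpn)   # ≈ (32.0967, 20.2694) (32.9808, 35.5717)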