
How far is Benghazi from Los Angeles, CA?

The distance between Los Angeles (Los Angeles International Airport) and Benghazi (Benina International Airport) is 7155 miles / 11515 kilometers / 6218 nautical miles.

Los Angeles International Airport – Benina International Airport
  • 7155 miles
  • 11515 kilometers
  • 6218 nautical miles


Distance from Los Angeles to Benghazi

There are several ways to calculate the distance from Los Angeles to Benghazi. Here are two standard methods:

Vincenty's formula (applied above)
  • 7155.386 miles
  • 11515.477 kilometers
  • 6217.860 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
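Vincenty's iterative inverse method can be sketched in Python. This is a minimal, illustrative implementation on the WGS-84 ellipsoid — the site does not publish its code, so the function name, convergence tolerance, and iteration cap below are assumptions — applied to the LAX and BEN coordinates listed further down.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty method on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)   # equatorial-line guard
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# LAX (33°56′33″N, 118°24′28″W) to BEN (32°5′48″N, 20°16′10″E)
dist_m = vincenty_distance(33.9425, -118.4078, 32.0967, 20.2694)
```

For these two airports the iteration converges quickly; it can fail to converge only for nearly antipodal points, which does not apply here.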

Haversine formula
  • 7141.693 miles
  • 11493.441 kilometers
  • 6205.962 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
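The haversine computation is much shorter; a minimal sketch, assuming the commonly used mean Earth radius of 6371 km (the site does not state which radius it uses):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance on a sphere of radius r_km, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(a))

# Same LAX and BEN coordinates as above, in decimal degrees
dist_km = haversine_km(33.9425, -118.4078, 32.0967, 20.2694)
```

The spherical result comes out roughly 22 km shorter than the ellipsoidal Vincenty figure for this route, which matches the gap between the two lists above.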

How long does it take to fly from Los Angeles to Benghazi?

The estimated flight time from Los Angeles International Airport to Benina International Airport is 14 hours and 2 minutes.
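The page does not state how this estimate is derived. A common rule of thumb is cruise time plus a fixed taxi/climb allowance; the 500 mph cruise speed and 30-minute overhead below are illustrative assumptions, so this sketch will not exactly reproduce the 14 h 2 min figure above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough block-time estimate: cruise time plus a fixed overhead allowance."""
    hours = distance_miles / cruise_mph + overhead_hours
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m:02d} min"

estimate = estimated_flight_time(7155)
```

Real block times also depend on winds (notably the eastbound jet stream), routing, and aircraft type.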

Flight carbon footprint between Los Angeles International Airport (LAX) and Benina International Airport (BEN)

On average, flying from Los Angeles to Benghazi generates about 877 kg of CO2 per passenger; 877 kilograms is equal to 1,933 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
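The kilogram-to-pound conversion can be checked directly using the exact definition of the international pound:

```python
KG_PER_LB = 0.45359237   # exact: 1 lb is defined as 0.45359237 kg

co2_kg = 877
co2_lb = co2_kg / KG_PER_LB   # about 1933 lbs
```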

Map of flight path from Los Angeles to Benghazi

See the map of the shortest flight path between Los Angeles International Airport (LAX) and Benina International Airport (BEN).

Airport information

Origin: Los Angeles International Airport
City: Los Angeles, CA
Country: United States
IATA Code: LAX
ICAO Code: KLAX
Coordinates: 33°56′33″N, 118°24′28″W
Destination: Benina International Airport
City: Benghazi
Country: Libya
IATA Code: BEN
ICAO Code: HLLB
Coordinates: 32°5′48″N, 20°16′10″E
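The coordinates above are given in degrees, minutes, and seconds, while the distance formulas work in decimal degrees. A small helper (hypothetical, for illustration) performs the conversion, negating southern and western hemispheres:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

lax_lat = dms_to_decimal(33, 56, 33, "N")    # 33°56′33″N ≈ 33.9425
lax_lon = dms_to_decimal(118, 24, 28, "W")   # 118°24′28″W ≈ -118.4078
```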