How far is Philadelphia, PA, from Manama?

The distance between Manama (Bahrain International Airport) and Philadelphia (Philadelphia International Airport) is 6705 miles / 10791 kilometers / 5827 nautical miles.

Bahrain International Airport – Philadelphia International Airport

  • 6705 miles
  • 10791 kilometers
  • 5827 nautical miles
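
All three figures express the same geodesic distance; the statute mile and the nautical mile are both defined exactly in terms of the metre, so the conversions are simple arithmetic, as in this small Python sketch:

    MILES_TO_KM = 1.609344   # exact, by definition
    NM_TO_KM = 1.852         # exact, by definition

    km = 6705.465 * MILES_TO_KM
    nm = km / NM_TO_KM
    print(f"{km:.2f} km, {nm:.2f} NM")   # ~10791.40 km, ~5826.89 NM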

Distance from Manama to Philadelphia

There are several ways to calculate the distance from Manama to Philadelphia. Here are two standard methods:

Vincenty's formula (applied above)
  • 6705.465 miles
  • 10791.399 kilometers
  • 5826.889 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
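
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method. The airport coordinates come from the Airport information section below; the use of the WGS-84 ellipsoid is an assumption about which reference ellipsoid the calculator uses:

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        # WGS-84 ellipsoid parameters (assumed reference ellipsoid)
        a = 6378137.0              # semi-major axis, metres
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis, metres

        u1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        u2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        big_l = math.radians(lon2 - lon1)
        sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
        sin_u2, cos_u2 = math.sin(u2), math.cos(u2)

        lam = big_l
        for _ in range(200):       # iterate the auxiliary longitude to convergence
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cos_u2 * sin_lam,
                                   cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2   # zero only for equatorial paths, not handled here
            cos_2sigma_m = cos_sigma - 2 * sin_u1 * sin_u2 / cos2_alpha
            c = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = big_l + (1 - c) * f * sin_alpha * (
                sigma + c * sin_sigma * (cos_2sigma_m
                + c * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        usq = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        big_a = 1 + usq / 16384 * (4096 + usq * (-768 + usq * (320 - 175 * usq)))
        big_b = usq / 1024 * (256 + usq * (-128 + usq * (74 - 47 * usq)))
        delta_sigma = big_b * sin_sigma * (cos_2sigma_m + big_b / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - big_b / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * big_a * (sigma - delta_sigma) / 1609.344   # metres -> statute miles

    # BAH and PHL coordinates in decimal degrees (from the Airport information below)
    print(vincenty_miles(26.2706, 50.6333, 39.8717, -75.2408))   # ~6705 miles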

Haversine formula
  • 6692.820 miles
  • 10771.050 kilometers
  • 5815.902 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
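
The spherical check is much shorter. The sketch below assumes a mean Earth radius of 3,958.8 statute miles (about 6,371 km), one common choice; the radius chosen is why the result differs slightly from the ellipsoidal figure:

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        r = 3958.8   # mean Earth radius in statute miles (assumed value, ~6371 km)
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    print(haversine_miles(26.2706, 50.6333, 39.8717, -75.2408))   # ~6693 miles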

How long does it take to fly from Manama to Philadelphia?

The estimated flight time from Bahrain International Airport to Philadelphia International Airport is 13 hours and 11 minutes.
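
The page does not state how this estimate is derived, but the figure is consistent with a common rule of thumb: a fixed allowance of about 30 minutes for taxi, climb, and descent, plus cruise at roughly 850 km/h. A sketch under that assumption:

    def flight_time(distance_km, cruise_kmh=850.0, overhead_min=30.0):
        # hypothetical rule of thumb: fixed taxi/climb/descent overhead plus cruise time
        hours = overhead_min / 60.0 + distance_km / cruise_kmh
        return int(hours), round((hours - int(hours)) * 60)

    h, m = flight_time(10791)
    print(f"{h} h {m} min")   # ~13 h 12 min, close to the 13 h 11 min quoted above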

Flight carbon footprint between Bahrain International Airport (BAH) and Philadelphia International Airport (PHL)

On average, flying from Manama to Philadelphia generates about 814 kg of CO2 per passenger; 814 kilograms equals 1,794 pounds (lbs). This figure is an estimate and includes only the CO2 generated by burning jet fuel.
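
The per-passenger figure works out to roughly 75 g of CO2 per kilometre flown. The sketch below simply back-calculates from the numbers above; the emission factor is an assumption derived from this page, not the calculator's actual methodology:

    KG_PER_LB = 0.45359237   # exact, by definition

    def co2_per_passenger_kg(distance_km, g_per_pax_km=75.4):
        # 75.4 g per passenger-km is back-calculated from 814 kg over 10791 km
        return distance_km * g_per_pax_km / 1000.0

    kg = co2_per_passenger_kg(10791)
    print(f"{kg:.0f} kg = {kg / KG_PER_LB:.0f} lbs")   # ~814 kg = ~1794 lbs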

Map of flight path from Manama to Philadelphia

See the map of the shortest flight path between Bahrain International Airport (BAH) and Philadelphia International Airport (PHL).

Airport information

Origin: Bahrain International Airport
City: Manama
Country: Bahrain
IATA Code: BAH
ICAO Code: OBBI
Coordinates: 26°16′14″N, 50°38′0″E
Destination: Philadelphia International Airport
City: Philadelphia, PA
Country: United States
IATA Code: PHL
ICAO Code: KPHL
Coordinates: 39°52′18″N, 75°14′27″W