
How far is Shanghai from Manama?

The distance between Manama (Bahrain International Airport) and Shanghai (Shanghai Pudong International Airport) is 4260 miles / 6855 kilometers / 3701 nautical miles.

The driving distance from Manama (BAH) to Shanghai (PVG) is 5837 miles / 9393 kilometers, and travel time by car is about 112 hours 10 minutes.

Bahrain International Airport – Shanghai Pudong International Airport

4260 miles / 6855 kilometers / 3701 nautical miles


Distance from Manama to Shanghai

There are several ways to calculate the distance from Manama to Shanghai. Here are two standard methods:

Vincenty's formula (applied above)
  • 4259.603 miles
  • 6855.167 kilometers
  • 3701.494 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
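For the curious, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, iteration cap, and convergence tolerance are illustrative choices, not the calculator's actual code:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid (illustrative sketch)."""
    a = 6378137.0            # WGS-84 semi-major axis, meters
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis

    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # Note: cos2_alpha is 0 for purely equatorial lines; fine for BAH-PVG
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (2 * cos_2sm ** 2 - 1)
        - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    meters = b * A * (sigma - d_sigma)
    return meters / 1609.344  # meters -> statute miles
```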

Haversine formula
  • 4251.739 miles
  • 6842.510 kilometers
  • 3694.660 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points along the sphere's surface).
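The spherical calculation is much shorter. A minimal sketch, assuming a mean Earth radius of 3958.8 statute miles (the site's exact radius choice isn't stated, so small last-digit differences are possible); the coordinates are converted to decimal degrees from the airport info below:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere (illustrative sketch)."""
    R = 3958.8  # assumed mean Earth radius in statute miles (~6371 km)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

# Decimal-degree equivalents of the coordinates listed under Airport information
BAH = (26.2706, 50.6333)    # 26°16′14″N, 50°38′0″E
PVG = (31.1433, 121.8050)   # 31°8′36″N, 121°48′18″E
print(round(haversine_miles(*BAH, *PVG), 1))  # ~4251.7 miles
```

The two methods differ by about 8 miles (roughly 0.2%), which is the typical size of the ellipsoid-versus-sphere discrepancy at this distance.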

How long does it take to fly from Manama to Shanghai?

The estimated flight time from Bahrain International Airport to Shanghai Pudong International Airport is 8 hours and 33 minutes.
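The page doesn't state the speed assumption behind this figure. A common rule of thumb, shown below as an assumption rather than the calculator's actual formula, divides the great-circle distance by a typical airliner average speed of roughly 500 mph:

```python
def flight_hours(distance_miles, avg_mph=500.0):
    # Rough estimate only: assumes a ~500 mph average and ignores
    # taxi time, routing, and winds, which real schedules include.
    return distance_miles / avg_mph

h = flight_hours(4259.6)
print(f"{int(h)} h {round((h % 1) * 60)} min")  # 8 h 31 min, close to the quoted 8 h 33 min
```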

Flight carbon footprint between Bahrain International Airport (BAH) and Shanghai Pudong International Airport (PVG)

On average, flying from Manama to Shanghai generates about 489 kg (1,078 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
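The kilogram-to-pound conversion can be checked directly, since the pound is defined as exactly 0.45359237 kg:

```python
co2_kg = 489
co2_lbs = co2_kg / 0.45359237  # exact kg-per-lb definition
print(round(co2_lbs))  # 1078
```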

Map of flight path and driving directions from Manama to Shanghai

See the map of the shortest flight path between Bahrain International Airport (BAH) and Shanghai Pudong International Airport (PVG).

Airport information

Origin: Bahrain International Airport
City: Manama
Country: Bahrain
IATA Code: BAH
ICAO Code: OBBI
Coordinates: 26°16′14″N, 50°38′0″E
Destination: Shanghai Pudong International Airport
City: Shanghai
Country: China
IATA Code: PVG
ICAO Code: ZSPD
Coordinates: 31°8′36″N, 121°48′18″E