
How far is Shanghai from Baler?

The distance between Baler (Dr. Juan C. Angara Airport) and Shanghai (Shanghai Pudong International Airport) is 1061 miles / 1707 kilometers / 922 nautical miles.

Dr. Juan C. Angara Airport – Shanghai Pudong International Airport

1061 miles / 1707 kilometers / 922 nautical miles


Distance from Baler to Shanghai

There are several ways to calculate the distance from Baler to Shanghai. Here are two standard methods:

Vincenty's formula (applied above)
  • 1060.936 miles
  • 1707.411 kilometers
  • 921.928 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
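As a sketch of how such a calculation works, here is a minimal Python implementation of Vincenty's inverse solution on the WGS-84 ellipsoid. This is a standard textbook formulation, not the calculator's actual code, and it omits the handling of nearly antipodal points where the iteration can fail to converge:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two points (WGS-84)."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # metres

# BQA and PVG coordinates (decimal degrees, from the airport info below)
d = vincenty_inverse(15.72972, 121.5, 31.14333, 121.805)
print(d / 1000)  # close to the 1707.411 km quoted above
```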

Haversine formula
  • 1065.152 miles
  • 1714.196 kilometers
  • 925.592 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
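The haversine calculation is short enough to show in full. A minimal Python version, using a mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# BQA and PVG coordinates (decimal degrees, from the airport info below)
print(haversine_km(15.72972, 121.5, 31.14333, 121.805))
```

The result is about 1714.2 km, matching the 1714.196 km figure above; the small gap versus Vincenty's 1707.411 km comes from treating the Earth as a sphere rather than an ellipsoid.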

How long does it take to fly from Baler to Shanghai?

The estimated flight time from Dr. Juan C. Angara Airport to Shanghai Pudong International Airport is 2 hours and 30 minutes.
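A common rule of thumb for block time is cruise distance at a typical jet speed plus a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute overhead below are assumptions, not the site's exact formula, so the result lands near but not exactly on the 2 hours 30 minutes quoted above:

```python
def estimate_flight_time(distance_miles: float,
                         cruise_mph: float = 500.0,
                         overhead_hours: float = 0.5) -> str:
    """Rule-of-thumb block time: cruise leg plus a fixed allowance
    for taxi, climb and descent (assumed values, not the site's)."""
    hours = overhead_hours + distance_miles / cruise_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m:02d} min"

print(estimate_flight_time(1061))  # → 2 h 37 min under these assumptions
```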

What is the time difference between Baler and Shanghai?

There is no time difference between Baler and Shanghai.

Flight carbon footprint between Dr. Juan C. Angara Airport (BQA) and Shanghai Pudong International Airport (PVG)

On average, flying from Baler to Shanghai generates about 155 kg of CO2 per passenger, equivalent to 341 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
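The unit conversion is straightforward, and the page's own numbers also imply a per-kilometre emission rate. The intensity value below is an inference from the figures above, not a published emission factor:

```python
LB_PER_KG = 2.20462  # pounds per kilogram

co2_kg = 155                  # per-passenger estimate from the text
co2_lb = co2_kg * LB_PER_KG   # ≈ 341.7 lb, quoted as 341 lbs

# Implied emission intensity for this route (inferred from the page's
# own figures, not an official factor):
distance_km = 1707
intensity = co2_kg / distance_km  # ≈ 0.091 kg CO2 per passenger-km
```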

Map of flight path from Baler to Shanghai

See the map of the shortest flight path between Dr. Juan C. Angara Airport (BQA) and Shanghai Pudong International Airport (PVG).

Airport information

Origin Dr. Juan C. Angara Airport
City: Baler
Country: Philippines
IATA Code: BQA
ICAO Code: RPUR
Coordinates: 15°43′47″N, 121°30′0″E
Destination Shanghai Pudong International Airport
City: Shanghai
Country: China
IATA Code: PVG
ICAO Code: ZSPD
Coordinates: 31°8′36″N, 121°48′18″E
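The coordinates above are in degrees-minutes-seconds form, while distance formulas take decimal degrees. A small helper for the conversion (the parsing pattern assumes the exact `D°M′S″H` format used on this page):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert e.g. '15°43′47″N' to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("15°43′47″N"))  # BQA latitude, ≈ 15.7297
print(dms_to_decimal("31°8′36″N"))   # PVG latitude, ≈ 31.1433
```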