How far is Baler from Pittsburgh, PA?

The distance between Pittsburgh (Pittsburgh International Airport) and Baler (Dr. Juan C. Angara Airport) is 8320 miles / 13390 kilometers / 7230 nautical miles.

Pittsburgh International Airport – Dr. Juan C. Angara Airport

Distance: 8320 miles / 13390 kilometers / 7230 nautical miles
Flight time: 16 h 15 min
CO2 emission: 1 045 kg

Distance from Pittsburgh to Baler

There are several ways to calculate the distance from Pittsburgh to Baler. Here are two standard methods:

Vincenty's formula (applied above)
  • 8319.911 miles
  • 13389.599 kilometers
  • 7229.805 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
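As a rough sketch (not the site's actual code), the iterative Vincenty inverse formula on the WGS-84 ellipsoid can be implemented in a few dozen lines of Python; the function name and convergence tolerance below are illustrative choices:

```python
import math

# WGS-84 ellipsoid constants
A = 6378137.0              # semi-major axis (meters)
F = 1 / 298.257223563      # flattening
B = (1 - F) * A            # semi-minor axis (meters)

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters between two (lat, lon) points in degrees."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # first approximation of the longitude difference on the auxiliary sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        # cos(2 * sigma_m); defined as 0 for equatorial geodesics (cos2_alpha == 0)
        cos_2sm = cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = F / 16.0 * cos2_alpha * (4.0 + F * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1.0 + u2 / 16384.0 * (4096.0 + u2 * (-768.0 + u2 * (320.0 - 175.0 * u2)))
    big_b = u2 / 1024.0 * (256.0 + u2 * (-128.0 + u2 * (74.0 - 47.0 * u2)))
    delta_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)
        - big_b / 6.0 * cos_2sm * (-3.0 + 4.0 * sin_sigma ** 2) * (-3.0 + 4.0 * cos_2sm ** 2)))
    return B * big_a * (sigma - delta_sigma)
```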

Haversine formula
  • 8309.311 miles
  • 13372.540 kilometers
  • 7220.594 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
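A minimal Python sketch of the haversine formula is shown below; the 6371 km mean Earth radius is a common convention, and a slightly different radius would shift the result by a few kilometers:

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius; the calculator's exact value may differ slightly

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))
```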

How long does it take to fly from Pittsburgh to Baler?

The estimated flight time from Pittsburgh International Airport to Dr. Juan C. Angara Airport is 16 hours and 15 minutes.
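As a rough sanity check, the quoted duration is consistent with an average block speed of about 512 mph. That speed is back-calculated from the figures above; the calculator's actual timing model (cruise speed, taxi and climb allowances) is not published here:

```python
# Back-of-the-envelope flight-time check. The 512 mph average block speed is
# back-calculated from the site's own numbers (8320 miles over 16 h 15 min),
# not taken from the calculator's actual model.
distance_miles = 8320
avg_speed_mph = 512  # assumption derived from the quoted figures
hours = distance_miles / avg_speed_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # -> 16 h 15 min
```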

Flight carbon footprint between Pittsburgh International Airport (PIT) and Dr. Juan C. Angara Airport (BQA)

On average, flying from Pittsburgh to Baler generates about 1 045 kg of CO2 per passenger; 1 045 kilograms is equivalent to 2 305 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path from Pittsburgh to Baler

See the map of the shortest flight path between Pittsburgh International Airport (PIT) and Dr. Juan C. Angara Airport (BQA).

Airport information

Origin Pittsburgh International Airport
City: Pittsburgh, PA
Country: United States
IATA Code: PIT
ICAO Code: KPIT
Coordinates: 40°29′29″N, 80°13′58″W
Destination Dr. Juan C. Angara Airport
City: Baler
Country: Philippines
IATA Code: BQA
ICAO Code: RPUR
Coordinates: 15°43′47″N, 121°30′0″E
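To tie the airport data above to the two formulas, here is a usage sketch: a hypothetical dms_to_deg helper converts the listed coordinates to decimal degrees, which can then be fed to the vincenty_distance and haversine_km functions sketched earlier:

```python
# Hypothetical helper linking the airport coordinates above to the two
# distance sketches from earlier (vincenty_distance and haversine_km).
def dms_to_deg(deg, minutes, seconds, positive=True):
    value = deg + minutes / 60 + seconds / 3600
    return value if positive else -value

pit = (dms_to_deg(40, 29, 29), dms_to_deg(80, 13, 58, positive=False))  # 40°29′29″N, 80°13′58″W
bqa = (dms_to_deg(15, 43, 47), dms_to_deg(121, 30, 0))                  # 15°43′47″N, 121°30′0″E

print(round(vincenty_distance(*pit, *bqa) / 1000, 1))  # ≈ 13389.6 km (ellipsoidal model)
print(round(haversine_km(*pit, *bqa), 1))              # ≈ 13372.5 km (spherical model)
```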