
How far is Baler from Salt Lake City, UT?

The distance between Salt Lake City (Salt Lake City International Airport) and Baler (Dr. Juan C. Angara Airport) is 7256 miles / 11678 kilometers / 6305 nautical miles.
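The three figures are unit conversions of a single measurement. A minimal sketch of the conversions in Python, using the exact definitions of the statute mile (1609.344 m) and the nautical mile (1852 m):

    MI_TO_KM = 1.609344    # statute mile in kilometers (exact definition)
    KM_PER_NMI = 1.852     # nautical mile in kilometers (exact definition)

    miles = 7256.138       # Vincenty distance from the section below
    km = miles * MI_TO_KM
    nmi = km / KM_PER_NMI
    print(f"{km:.3f} km, {nmi:.2f} nmi")   # ≈ 11677.622 km, 6305.41 nmi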


Distance from Salt Lake City to Baler

There are several ways to calculate the distance from Salt Lake City to Baler. Here are two standard methods:

Vincenty's formula (applied above)
  • 7256.138 miles
  • 11677.622 kilometers
  • 6305.412 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
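For readers who want to reproduce the figure, here is a sketch of Vincenty's inverse method on the WGS-84 ellipsoid in Python. The coordinates are taken from the airport information section below; the code follows the standard published iteration, and the caveat that the iteration can fail to converge for nearly antipodal points is noted in a comment.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Vincenty inverse solution on the WGS-84 ellipsoid; returns statute miles."""
        a = 6378137.0                  # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563          # WGS-84 flattening
        b = (1 - f) * a                # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        # Reduced latitudes on the auxiliary sphere
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):      # may fail to converge near antipodal points
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0             # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sig_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                          if cos2_alpha else 0.0)   # equatorial line: alpha = 90°
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sig_m + C * cos_sigma *
                                         (-1 + 2 * cos_2sig_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sig_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sig_m ** 2) -
            B / 6 * cos_2sig_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sig_m ** 2)))
        return b * A * (sigma - d_sigma) / 1609.344   # metres -> statute miles

    # SLC (40°47′18″N, 111°58′40″W) to BQA (15°43′47″N, 121°30′0″E)
    print(vincenty_miles(40.7883, -111.9778, 15.7297, 121.5))   # ≈ 7256.1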

Haversine formula
  • 7245.777 miles
  • 11660.948 kilometers
  • 6296.408 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of the sphere).
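A minimal sketch of the haversine calculation, assuming the commonly used mean Earth radius of 3958.8 miles (6371 km); the small difference from the Vincenty figure above comes entirely from the spherical-earth assumption:

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius=3958.8):
        """Great-circle distance in miles on a sphere of the given radius."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius * math.asin(math.sqrt(h))

    # Same coordinates as above
    print(haversine_miles(40.7883, -111.9778, 15.7297, 121.5))   # ≈ 7245.8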

How long does it take to fly from Salt Lake City to Baler?

The estimated flight time from Salt Lake City International Airport to Dr. Juan C. Angara Airport is 14 hours and 14 minutes.
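The page does not state how this estimate is derived. A common approach is simply distance divided by an assumed average block speed; back-solving 7256 miles against 14 hours 14 minutes implies an effective speed of roughly 510 mph, so here is a sketch under that assumption (the constant is back-solved, not a published figure):

    def flight_time(distance_miles, avg_mph=510):
        """Naive flight-time estimate; avg_mph is an assumed effective
        average speed back-solved from this route's figures."""
        hours = distance_miles / avg_mph
        h, m = int(hours), round((hours - int(hours)) * 60)
        return f"{h} hours and {m} minutes"

    print(flight_time(7256))   # -> "14 hours and 14 minutes"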

Flight carbon footprint between Salt Lake City International Airport (SLC) and Dr. Juan C. Angara Airport (BQA)

On average, flying from Salt Lake City to Baler generates about 891 kg of CO2 per passenger, which is equivalent to roughly 1,965 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
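The page does not publish its emissions model; the figures imply a flat per-passenger factor of about 891 kg / 11678 km ≈ 0.076 kg of CO2 per kilometer, so here is a back-of-envelope sketch under that assumption:

    KG_CO2_PER_PAX_KM = 891 / 11678   # implied by this page's figures (assumption)
    LBS_PER_KG = 2.20462

    def co2_per_passenger_kg(distance_km):
        """Rough per-passenger CO2 estimate; covers jet-fuel combustion only."""
        return distance_km * KG_CO2_PER_PAX_KM

    kg = co2_per_passenger_kg(11678)
    print(f"{kg:.0f} kg CO2 ≈ {kg * LBS_PER_KG:.0f} lbs")   # -> 891 kg CO2 ≈ 1964 lbs

The small difference from the page's 1,965 lbs comes from rounding the kilogram figure to 891 before converting.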

Map of flight path from Salt Lake City to Baler

See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Dr. Juan C. Angara Airport (BQA).

Airport information

Origin: Salt Lake City International Airport
City: Salt Lake City, UT
Country: United States
IATA Code: SLC
ICAO Code: KSLC
Coordinates: 40°47′18″N, 111°58′40″W
Destination: Dr. Juan C. Angara Airport
City: Baler
Country: Philippines
IATA Code: BQA
ICAO Code: RPUR
Coordinates: 15°43′47″N, 121°30′0″E