How far is Anahim Lake from Philadelphia, PA?

The distance between Philadelphia (Wings Field) and Anahim Lake (Anahim Lake Airport) is 2485 miles / 3999 kilometers / 2159 nautical miles.

The driving distance from Philadelphia (BBX) to Anahim Lake (YAA) is 3137 miles / 5049 kilometers, and travel time by car is about 60 hours 41 minutes.

Wings Field – Anahim Lake Airport

Distance from Philadelphia to Anahim Lake

There are several ways to calculate the distance from Philadelphia to Anahim Lake. Here are two standard methods:

Vincenty's formula (applied above)
  • 2484.668 miles
  • 3998.686 kilometers
  • 2159.118 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
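For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the airport positions from the table at the bottom of the page, converted to decimal degrees; the convergence tolerance and iteration cap are arbitrary choices.

```python
# A minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid.
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    a = 6378137.0               # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344    # meters -> statute miles

# BBX (40°8'15"N, 75°15'54"W) to YAA (52°27'8"N, 125°18'10"W)
print(vincenty_miles(40.1375, -75.265, 52.452222, -125.302778))
# -> ~2484.7 miles, matching the figure quoted above
```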

Haversine formula
  • 2478.519 miles
  • 3988.789 kilometers
  • 2153.774 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
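A short sketch of the haversine calculation follows. The mean Earth radius of 6371 km is a common choice but still an assumption; other radius values shift the result slightly.

```python
# Great-circle distance via the haversine formula on a spherical Earth.
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_km(40.1375, -75.265, 52.452222, -125.302778))
# -> ~3989 km, i.e. roughly 2478 miles, matching the figures above
```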

How long does it take to fly from Philadelphia to Anahim Lake?

The estimated flight time from Wings Field to Anahim Lake Airport is 5 hours and 12 minutes.
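The timing model behind this estimate isn't published; working backwards, 2485 miles in 5 hours 12 minutes implies an average block speed of about 478 mph. A sketch using that assumed speed:

```python
# A sketch of the flight-time estimate. The ~478 mph average block speed
# is an assumption inferred from the quoted distance and time; the site
# does not state its actual formula.
distance_miles = 2485
avg_speed_mph = 478          # assumed average block speed

hours = distance_miles / avg_speed_mph
h, m = divmod(round(hours * 60), 60)
print(f"{h} hours and {m} minutes")   # -> 5 hours and 12 minutes
```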

Flight carbon footprint between Wings Field (BBX) and Anahim Lake Airport (YAA)

On average, flying from Philadelphia to Anahim Lake generates about 273 kg of CO2 per passenger; 273 kilograms equals about 602 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
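Dividing the quoted total by the distance gives an implied emission factor of roughly 0.11 kg of CO2 per passenger mile. A sketch under that assumption:

```python
# A sketch of the CO2 estimate. The 0.11 kg-per-passenger-mile factor is
# an assumption implied by the quoted figures, not a published constant.
KG_PER_PASSENGER_MILE = 0.11     # assumed emission factor
LB_PER_KG = 2.20462

distance_miles = 2485
co2_kg = round(distance_miles * KG_PER_PASSENGER_MILE)   # -> 273
co2_lb = co2_kg * LB_PER_KG                              # -> ~602
print(f"{co2_kg} kg CO2 (~{co2_lb:.0f} lb) per passenger")
```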

Map of flight path and driving directions from Philadelphia to Anahim Lake

See the map of the shortest flight path between Wings Field (BBX) and Anahim Lake Airport (YAA).

Airport information

Origin: Wings Field
City: Philadelphia, PA
Country: United States
IATA Code: BBX
ICAO Code: KLOM
Coordinates: 40°8′15″N, 75°15′54″W

Destination: Anahim Lake Airport
City: Anahim Lake
Country: Canada
IATA Code: YAA
ICAO Code: CAJ4
Coordinates: 52°27′8″N, 125°18′10″W
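The coordinates above are given in degrees, minutes, and seconds. A small sketch converting them to the decimal degrees used by the distance formulas earlier on the page (west and south hemispheres become negative):

```python
# Convert degrees-minutes-seconds coordinates to decimal degrees.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Wings Field (BBX): 40°8'15"N, 75°15'54"W
print(dms_to_decimal(40, 8, 15, "N"), dms_to_decimal(75, 15, 54, "W"))
# -> 40.1375 -75.265

# Anahim Lake Airport (YAA): 52°27'8"N, 125°18'10"W
print(dms_to_decimal(52, 27, 8, "N"), dms_to_decimal(125, 18, 10, "W"))
# -> ~52.452222 ~-125.302778
```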