
How far is Poplar Hill from Philadelphia, PA?

The distance between Philadelphia (Wings Field) and Poplar Hill (Poplar Hill Airport) is 1224 miles / 1970 kilometers / 1064 nautical miles.

The driving distance from Philadelphia (BBX) to Poplar Hill (YHP) is 1716 miles / 2761 kilometers, and travel time by car is about 35 hours 42 minutes.

Wings Field – Poplar Hill Airport

  • 1224 miles
  • 1970 kilometers
  • 1064 nautical miles


Distance from Philadelphia to Poplar Hill

There are several ways to calculate the distance from Philadelphia to Poplar Hill. Here are two standard methods:

Vincenty's formula (applied above)
  • 1224.360 miles
  • 1970.416 kilometers
  • 1063.940 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
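For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is a standard textbook implementation, not the calculator's own code, and the coordinates are the airport coordinates listed below, converted to decimal degrees.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2):
        """Distance in meters between two points, via Vincenty's inverse
        formula on the WGS-84 ellipsoid."""
        a = 6378137.0            # WGS-84 semi-major axis (meters)
        f = 1 / 298.257223563    # WGS-84 flattening
        b = (1 - f) * a          # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):  # iterate until the longitude difference converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # BBX and YHP coordinates (from the airport information below)
    meters = vincenty_distance(40.1375, -75.265, 52.113056, -94.255556)
    print(meters / 1609.344)  # ≈ 1224.4 statute miles, matching the figure above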

Haversine formula
  • 1222.677 miles
  • 1967.709 kilometers
  • 1062.478 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
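The spherical version fits in a few lines. This sketch assumes the commonly used mean Earth radius of 6371 km; with that radius it lands close to the haversine figures above.

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in kilometers on a sphere of the given radius."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    km = haversine_distance(40.1375, -75.265, 52.113056, -94.255556)
    print(km, km * 0.621371, km / 1.852)  # kilometers, statute miles, nautical miles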

How long does it take to fly from Philadelphia to Poplar Hill?

The estimated flight time from Wings Field to Poplar Hill Airport is 2 hours and 49 minutes.
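The page does not state how this estimate is derived. A common rule of thumb is an assumed average cruise speed plus a fixed allowance for taxi, climb, and descent; the parameters in this sketch are illustrative assumptions, not the calculator's own, so it will not reproduce the 2 hours 49 minutes exactly.

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        # cruise_mph and overhead_min are assumed, illustrative values
        total_min = overhead_min + distance_miles / cruise_mph * 60
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} hours {minutes} minutes"

    print(estimate_flight_time(1224.360))  # "2 hours 57 minutes" with these assumptions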

Flight carbon footprint between Wings Field (BBX) and Poplar Hill Airport (YHP)

On average, flying from Philadelphia to Poplar Hill generates about 162 kg of CO2 per passenger; 162 kilograms equals 357 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
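The unit conversion and the per-mile rate implied by these figures are easy to check; the emission estimate itself comes from the page, not from this sketch.

    KG_TO_LB = 2.20462            # pounds per kilogram

    co2_kg = 162                  # per-passenger estimate quoted above
    print(co2_kg * KG_TO_LB)      # ≈ 357 lb
    print(co2_kg / 1224.360)      # ≈ 0.13 kg of CO2 per passenger-mile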

Map of flight path and driving directions from Philadelphia to Poplar Hill

See the map of the shortest flight path between Wings Field (BBX) and Poplar Hill Airport (YHP).

Airport information

Origin: Wings Field
City: Philadelphia, PA
Country: United States
IATA Code: BBX
ICAO Code: KLOM
Coordinates: 40°8′15″N, 75°15′54″W
Destination: Poplar Hill Airport
City: Poplar Hill
Country: Canada
IATA Code: YHP
ICAO Code: CPV7
Coordinates: 52°6′47″N, 94°15′20″W
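To use these coordinates with the formulas above, convert degrees/minutes/seconds to decimal degrees, with south and west taken as negative. A minimal helper:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter
        to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    bbx = (dms_to_decimal(40, 8, 15, "N"), dms_to_decimal(75, 15, 54, "W"))
    yhp = (dms_to_decimal(52, 6, 47, "N"), dms_to_decimal(94, 15, 20, "W"))
    print(bbx)  # (40.1375, -75.265)
    print(yhp)  # (≈ 52.1131, ≈ -94.2556)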