How far is St. Lewis from Philadelphia, PA?
The distance between Philadelphia (Wings Field) and St. Lewis (St. Lewis (Fox Harbour) Airport) is 1256 miles / 2021 kilometers / 1091 nautical miles.
The driving distance from Philadelphia (BBX) to St. Lewis (YFX) is 1816 miles / 2923 kilometers, and travel time by car is about 42 hours 8 minutes.
Wings Field – St. Lewis (Fox Harbour) Airport
Distance from Philadelphia to St. Lewis
There are several ways to calculate the distance from Philadelphia to St. Lewis. Here are two standard methods:
Vincenty's formula (applied above)
- 1255.535 miles
- 2020.588 kilometers
- 1091.030 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
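Below is a minimal Python sketch of the inverse Vincenty calculation on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport tables at the end of this page; the convergence tolerance and iteration cap are arbitrary implementation choices, not part of the published formula.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0                  # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (2 * cos_2sm ** 2 - 1)
        - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3)
                          * (4 * cos_2sm ** 2 - 3)))
    return b * A * (sigma - delta_sigma) / 1609.344  # meters -> statute miles

# BBX (40°8′15″N, 75°15′54″W) to YFX (52°22′22″N, 55°40′26″W)
print(vincenty_miles(40.1375, -75.265, 52.372778, -55.673889))  # ~1255.5
```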
Haversine formula
- 1253.778 miles
- 2017.761 kilometers
- 1089.504 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
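The haversine version is much shorter, since it needs no iteration. The 3,958.8-mile mean Earth radius below is a common convention; a slightly different radius shifts the result by a fraction of a mile.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a spherical Earth, in statute miles."""
    R = 3958.8  # assumed mean Earth radius in miles; conventions vary slightly
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

print(haversine_miles(40.1375, -75.265, 52.372778, -55.673889))  # ~1253.8
```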
How long does it take to fly from Philadelphia to St. Lewis?
The estimated flight time from Wings Field to St. Lewis (Fox Harbour) Airport is 2 hours and 52 minutes.
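Estimates like this typically follow a rule of thumb: great-circle distance divided by an assumed average speed, plus a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute allowance in this sketch are illustrative assumptions, not the exact parameters behind the figure above (2 hours 52 minutes implies a somewhat faster effective speed).

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: cruise leg plus a taxi/climb/descent allowance.
    Both default parameters are assumptions for illustration."""
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = flight_time_minutes(1255.535)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ~3 h with these assumptions
```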
What is the time difference between Philadelphia and St. Lewis?
St. Lewis observes Newfoundland Time (UTC−3:30), which is 1 hour 30 minutes ahead of Philadelphia's Eastern Time (UTC−5). Both locations observe daylight saving time, so the offset holds year-round.
Flight carbon footprint between Wings Field (BBX) and St. Lewis (Fox Harbour) Airport (YFX)
On average, flying from Philadelphia to St. Lewis generates about 164 kg of CO2 per passenger; 164 kilograms is about 362 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
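The per-passenger figure scales roughly linearly with distance. The emission factor in this sketch is reverse-engineered from the 164 kg and 1255.535-mile figures above, so it is only a back-of-envelope check, not an official factor.

```python
# Emission factor implied by the figures above (~0.131 kg CO2 per passenger-mile).
KG_CO2_PER_PASSENGER_MILE = 164 / 1255.535

def co2_kg(distance_miles):
    return distance_miles * KG_CO2_PER_PASSENGER_MILE

kg = co2_kg(1255.535)
print(f"{kg:.0f} kg CO2 ≈ {kg * 2.20462:.1f} lbs")  # 164 kg ≈ 361.6 lbs
```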
Map of flight path and driving directions from Philadelphia to St. Lewis
See the map of the shortest flight path between Wings Field (BBX) and St. Lewis (Fox Harbour) Airport (YFX).
Airport information
| Origin | Wings Field |
| --- | --- |
| City: | Philadelphia, PA |
| Country: | United States |
| IATA Code: | BBX |
| ICAO Code: | KLOM |
| Coordinates: | 40°8′15″N, 75°15′54″W |
| Destination | St. Lewis (Fox Harbour) Airport |
| --- | --- |
| City: | St. Lewis |
| Country: | Canada |
| IATA Code: | YFX |
| ICAO Code: | CCK4 |
| Coordinates: | 52°22′22″N, 55°40′26″W |
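The distance formulas above take signed decimal degrees, while the tables list degrees, minutes, and seconds. A small conversion helper (the only non-obvious step is the negative sign for the southern and western hemispheres) reproduces the decimal values used in the snippets above:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value  # S/W are negative

print(dms_to_decimal(40, 8, 15, "N"), dms_to_decimal(75, 15, 54, "W"))   # BBX: 40.1375 -75.265
print(dms_to_decimal(52, 22, 22, "N"), dms_to_decimal(55, 40, 26, "W"))  # YFX: ~52.3728 -55.6739
```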