How far is St. Lewis from Pittsburgh, PA?
The distance between Pittsburgh (Pittsburgh International Airport) and St. Lewis (St. Lewis (Fox Harbour) Airport) is 1420 miles / 2285 kilometers / 1234 nautical miles.
The driving distance from Pittsburgh (PIT) to St. Lewis (YFX) is 1988 miles / 3200 kilometers, and travel time by car is about 45 hours 26 minutes.
Pittsburgh International Airport – St. Lewis (Fox Harbour) Airport
Distance from Pittsburgh to St. Lewis
There are several ways to calculate the distance from Pittsburgh to St. Lewis. Here are two standard methods:
Vincenty's formula (applied above)
- 1420.095 miles
- 2285.421 kilometers
- 1234.028 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
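For reference, here is a minimal sketch of Vincenty's inverse solution on the WGS-84 ellipsoid. The function name, tolerance, and iteration cap are illustrative choices, not the site's actual implementation; applied to the PIT and YFX coordinates from the airport tables below, it should reproduce the roughly 1420-mile figure quoted above.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2,
                     a=6378137.0, f=1 / 298.257223563,
                     tol=1e-12, max_iter=200):
    """Ellipsoidal distance in metres (WGS-84: a = semi-major axis, f = flattening)."""
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # first approximation of the longitude difference on the auxiliary sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero for equatorial lines where cos2_alpha == 0
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)  # metres

# PIT (40°29′29″N, 80°13′58″W) -> YFX (52°22′22″N, 55°40′26″W)
d_m = vincenty_inverse(40.49139, -80.23278, 52.37278, -55.67389)
print(f"{d_m / 1609.344:.3f} miles")  # ≈ 1420 miles
```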
Haversine formula
- 1417.541 miles
- 2281.312 kilometers
- 1231.810 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
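A compact sketch of the haversine computation follows. The IUGG mean Earth radius of 6371.0088 km is one common choice; the page does not state which radius it uses, so the last decimal places may differ slightly.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0088):
    """Great-circle distance assuming a spherical earth (mean radius in km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    d_km = 2 * radius_km * math.asin(math.sqrt(a))
    return d_km / 1.609344  # kilometers -> statute miles

# PIT -> YFX, using the coordinates from the airport tables below
print(f"{haversine_miles(40.49139, -80.23278, 52.37278, -55.67389):.3f} miles")  # ≈ 1417.5
```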
How long does it take to fly from Pittsburgh to St. Lewis?
The estimated flight time from Pittsburgh International Airport to St. Lewis (Fox Harbour) Airport is 3 hours and 11 minutes.
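The page does not state its flight-time model, but estimates like this are commonly built from a fixed taxi/climb allowance plus cruise at a constant average speed. A sketch under those assumptions (the 500 mph cruise speed and 30-minute overhead are guesses, and they yield roughly 3 hours 20 minutes rather than the quoted 3 hours 11 minutes, so the site evidently uses slightly different constants):

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # Assumed rule of thumb: fixed taxi/climb overhead plus constant-speed
    # cruise; both constants are illustrative, not the site's actual model.
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimated_flight_time(1420))  # -> "3 hours 20 minutes" under these assumptions
```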
What is the time difference between Pittsburgh and St. Lewis?
Pittsburgh is on Eastern Time, while St. Lewis, in southeastern Labrador, observes Newfoundland Time, so St. Lewis is 1 hour 30 minutes ahead of Pittsburgh.
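The offset can be checked programmatically. A short sketch assuming the IANA zones America/New_York for Pittsburgh and America/St_Johns for St. Lewis (the zone covering Newfoundland Time, including southeastern Labrador):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

now = datetime.now(timezone.utc)
pit_offset = now.astimezone(ZoneInfo("America/New_York")).utcoffset()
yfx_offset = now.astimezone(ZoneInfo("America/St_Johns")).utcoffset()
print(f"St. Lewis is {yfx_offset - pit_offset} ahead of Pittsburgh")  # 1:30:00
```

Both zones switch to daylight saving time on the same dates, so the 1.5-hour gap holds year-round.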
Flight carbon footprint between Pittsburgh International Airport (PIT) and St. Lewis (Fox Harbour) Airport (YFX)
On average, flying from Pittsburgh to St. Lewis generates about 174 kg of CO2 per passenger; 174 kilograms equals about 384 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
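The quoted figure works out to roughly 0.12 kg of CO2 per passenger per flown mile. The sketch below back-derives that factor from this page's own numbers; it is not a published emissions model.

```python
KG_PER_LB = 0.45359237             # exact definition of the pound
CO2_KG_PER_MILE = 174 / 1420.095   # ≈ 0.1225, inferred from this page's figures

def co2_estimate(distance_miles):
    """Per-passenger CO2 from jet-fuel burn, using the inferred per-mile factor."""
    kg = distance_miles * CO2_KG_PER_MILE
    return kg, kg / KG_PER_LB

kg, lb = co2_estimate(1420.095)
print(f"{kg:.0f} kg CO2 (~{lb:.0f} lb)")  # -> 174 kg (~384 lb)
```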
Map of flight path and driving directions from Pittsburgh to St. Lewis
See the map of the shortest flight path between Pittsburgh International Airport (PIT) and St. Lewis (Fox Harbour) Airport (YFX).
Airport information
| Origin | Pittsburgh International Airport |
| --- | --- |
| City: | Pittsburgh, PA |
| Country: | United States |
| IATA Code: | PIT |
| ICAO Code: | KPIT |
| Coordinates: | 40°29′29″N, 80°13′58″W |
| Destination | St. Lewis (Fox Harbour) Airport |
| --- | --- |
| City: | St. Lewis |
| Country: | Canada |
| IATA Code: | YFX |
| ICAO Code: | CCK4 |
| Coordinates: | 52°22′22″N, 55°40′26″W |