How far is St. Lewis from Churchill?
The distance between Churchill (Churchill Airport) and St. Lewis (St. Lewis (Fox Harbour) Airport) is 1544 miles / 2486 kilometers / 1342 nautical miles.
The driving distance from Churchill (YYQ) to St. Lewis (YFX) is 3484 miles / 5607 kilometers, and travel time by car is about 80 hours 8 minutes.
Churchill Airport – St. Lewis (Fox Harbour) Airport
Distance from Churchill to St. Lewis
There are several ways to calculate the distance from Churchill to St. Lewis. Here are two standard methods:
Vincenty's formula (applied above)
- 1544.455 miles
- 2485.560 kilometers
- 1342.095 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
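For reference, here is a self-contained Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates are taken from the table below; the iteration cap and convergence threshold are implementation choices, not part of the formula itself.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in meters between two points on the WGS-84 ellipsoid,
    computed with Vincenty's inverse formula."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)   # equatorial-line case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YYQ (58°44′21″N, 94°3′54″W) to YFX (52°22′22″N, 55°40′26″W)
meters = vincenty_distance(58.739167, -94.065, 52.372778, -55.673889)
print(f"{meters / 1609.344:.3f} miles")   # ≈ 1544 miles
```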
Haversine formula
- 1539.493 miles
- 2477.574 kilometers
- 1337.783 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
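A corresponding haversine sketch, assuming a mean Earth radius of 6,371 km (a common convention; the exact radius chosen shifts the result slightly):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometers on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

km = haversine_distance(58.739167, -94.065, 52.372778, -55.673889)
print(f"{km:.3f} km")   # ≈ 2478 km, slightly below the ellipsoidal figure
```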
How long does it take to fly from Churchill to St. Lewis?
The estimated flight time from Churchill Airport to St. Lewis (Fox Harbour) Airport is 3 hours and 25 minutes.
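The page does not state how this estimate is derived; a common approximation is cruise time (distance divided by a typical airliner cruise speed) plus a fixed allowance for taxi, climb, and descent. The sketch below assumes a 500 mph cruise and a 20-minute allowance, which happens to reproduce the quoted figure.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=20):
    """Rough block-time estimate: cruise time plus a fixed allowance for
    taxi, climb, and descent. Both parameters are assumptions."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(1544.455))   # 3 h 25 min
```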
What is the time difference between Churchill and St. Lewis?
The time difference between Churchill and St. Lewis is 2 hours 30 minutes: Churchill observes Central Time (UTC−6), while St. Lewis, on Labrador's southeastern coast, observes Newfoundland Time (UTC−3:30). St. Lewis is 2 hours 30 minutes ahead of Churchill.
Flight carbon footprint between Churchill Airport (YYQ) and St. Lewis (Fox Harbour) Airport (YFX)
On average, flying from Churchill to St. Lewis generates about 182 kg (402 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
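The underlying emission model is not given. The sketch below assumes a flat per-mile factor back-solved from the quoted numbers (about 0.118 kg of CO2 per passenger-mile), purely to illustrate the arithmetic.

```python
def co2_estimate_kg(distance_miles, kg_per_mile=0.118):
    """Per-passenger CO2 from jet-fuel burn only; the per-mile factor
    is an assumption inferred from the figures quoted above."""
    return distance_miles * kg_per_mile

kg = co2_estimate_kg(1544.455)
print(f"{kg:.0f} kg = {kg * 2.20462:.0f} lbs")   # ≈ 182 kg = 402 lbs
```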
Map of flight path and driving directions from Churchill to St. Lewis
See the map of the shortest flight path between Churchill Airport (YYQ) and St. Lewis (Fox Harbour) Airport (YFX).
Airport information
| Origin | Churchill Airport |
| --- | --- |
| City | Churchill |
| Country | Canada |
| IATA Code | YYQ |
| ICAO Code | CYYQ |
| Coordinates | 58°44′21″N, 94°3′54″W |
| Destination | St. Lewis (Fox Harbour) Airport |
| --- | --- |
| City | St. Lewis |
| Country | Canada |
| IATA Code | YFX |
| ICAO Code | CCK4 |
| Coordinates | 52°22′22″N, 55°40′26″W |