How far is London from Wabush?
The distance between Wabush (Wabush Airport) and London (London International Airport) is 948 miles / 1526 kilometers / 824 nautical miles.
The driving distance from Wabush (YWK) to London (YXU) is 1224 miles / 1970 kilometers, and travel time by car is about 28 hours 1 minute.
Wabush Airport – London International Airport
Distance from Wabush to London
There are several ways to calculate the distance from Wabush to London. Here are two standard methods:
Vincenty's formula (applied above)
- 948.454 miles
- 1526.388 kilometers
- 824.184 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 947.130 miles
- 1524.258 kilometers
- 823.033 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
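The haversine calculation can be sketched in a few lines of Python. The coordinates below are the two airports' positions converted to decimal degrees (west longitudes negative), and the 6371 km mean Earth radius is an assumption; distance sites may use a slightly different radius, which shifts the result by a few kilometers.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km, assuming a spherical Earth."""
    R = 6371.0  # assumed mean Earth radius in kilometers
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Wabush (YWK) and London (YXU) in decimal degrees
ywk = (52.9217, -66.8642)
yxu = (43.0356, -81.1539)
print(round(haversine_km(*ywk, *yxu), 1))  # about 1524 km
```

The small gap between this result and the Vincenty figure above comes from the spherical-Earth simplification; Vincenty's ellipsoidal model is more accurate but considerably more involved to implement.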
How long does it take to fly from Wabush to London?
The estimated flight time from Wabush Airport to London International Airport is 2 hours and 17 minutes.
What is the time difference between Wabush and London?
The time difference between Wabush and London is 1 hour. London is 1 hour behind Wabush.
Flight carbon footprint between Wabush Airport (YWK) and London International Airport (YXU)
On average, flying from Wabush to London generates about 147 kg (325 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path and driving directions from Wabush to London
See the map of the shortest flight path between Wabush Airport (YWK) and London International Airport (YXU).
Airport information
| Origin | Wabush Airport |
|---|---|
| City | Wabush |
| Country | Canada |
| IATA Code | YWK |
| ICAO Code | CYWK |
| Coordinates | 52°55′18″N, 66°51′51″W |
| Destination | London International Airport |
|---|---|
| City | London |
| Country | Canada |
| IATA Code | YXU |
| ICAO Code | CYXU |
| Coordinates | 43°2′8″N, 81°9′14″W |
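The tables list coordinates in degrees/minutes/seconds, while distance formulas need decimal degrees. A minimal conversion helper (the function name and regex are illustrative, not from any particular library) might look like:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Parse a DMS string like 52°55′18″N into signed decimal degrees.

    Southern and western hemispheres become negative values.
    """
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, minute, sec, hemi = m.groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value

print(round(dms_to_decimal("52°55′18″N"), 4))  # 52.9217
print(round(dms_to_decimal("66°51′51″W"), 4))  # -66.8642
```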