How far is Whale Cove from Sudbury?
The distance between Sudbury (Sudbury Airport) and Whale Cove (Whale Cove Airport) is 1176 miles / 1892 kilometers / 1022 nautical miles.
The driving distance from Sudbury (YSB) to Whale Cove (YXN) is 1711 miles / 2753 kilometers, and travel time by car is about 38 hours 25 minutes.
Sudbury Airport – Whale Cove Airport
Distance from Sudbury to Whale Cove
There are several ways to calculate the distance from Sudbury to Whale Cove. Here are two standard methods:
Vincenty's formula (applied above)
- 1175.841 miles
- 1892.332 kilometers
- 1021.777 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
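For comparison, an ellipsoidal (WGS-84) distance can be reproduced with the geopy library. Note that geopy's `geodesic` uses Karney's algorithm rather than Vincenty's formula, so this is a comparable ellipsoidal calculation rather than the exact routine used above; the coordinates are taken from the airport tables below.

```python
# Ellipsoidal (WGS-84) distance sketch using geopy (Karney's geodesic, similar in purpose to Vincenty).
from geopy.distance import geodesic

ysb = (46.625, -80.799)   # Sudbury Airport (lat, lon)
yxn = (62.240, -92.598)   # Whale Cove Airport (lat, lon)

d = geodesic(ysb, yxn)
print(f"{d.miles:.1f} mi, {d.km:.1f} km, {d.nautical:.1f} nm")  # about 1176 mi / 1892 km
```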
Haversine formula
- 1174.203 miles
- 1889.696 kilometers
- 1020.354 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
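The great-circle figure above can be reproduced in a few lines of code. The sketch below implements the haversine formula with a mean Earth radius of 6,371 km; the airport coordinates are taken from the tables further down the page.

```python
# Great-circle (haversine) distance sketch, assuming a spherical earth of radius 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Return the great-circle distance between two points in kilometres."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

km = haversine_km(46.625, -80.799, 62.240, -92.598)   # YSB -> YXN
print(f"{km:.0f} km  ({km / 1.609344:.0f} mi)")        # about 1890 km / 1174 mi
```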
How long does it take to fly from Sudbury to Whale Cove?
The estimated flight time from Sudbury Airport to Whale Cove Airport is 2 hours and 43 minutes.
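One way to reproduce an estimate like this is to divide the great-circle distance by a typical airliner cruising speed and add a fixed allowance for take-off and landing. The 850 km/h speed and 30-minute allowance below are assumptions chosen for illustration, not figures stated on this page.

```python
# Rough flight-time estimate (assumed cruise speed and take-off/landing allowance).
distance_km = 1892          # great-circle distance from the figures above
cruise_kmh = 850            # assumed average airliner speed
overhead_min = 30           # assumed allowance for take-off and landing

total_min = distance_km / cruise_kmh * 60 + overhead_min
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # -> 2 h 43 min
```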
What is the time difference between Sudbury and Whale Cove?
The time difference between Sudbury and Whale Cove is 1 hour. Whale Cove is 1 hour behind Sudbury.
Flight carbon footprint between Sudbury Airport (YSB) and Whale Cove Airport (YXN)
On average, flying from Sudbury to Whale Cove generates about 160 kg of CO2 per passenger, which is roughly 354 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
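A per-passenger figure like this is typically derived by applying an emission factor to the flight distance. The factor used below (about 0.085 kg of CO2 per passenger-kilometre) is an assumption back-calculated from the numbers above, not a value published on this page.

```python
# Rough per-passenger CO2 estimate (assumed emission factor for jet fuel burn).
distance_km = 1892                 # great-circle distance from the figures above
kg_co2_per_pax_km = 0.085          # assumed emission factor per passenger-kilometre

kg = distance_km * kg_co2_per_pax_km
lbs = kg * 2.20462
print(f"{kg:.0f} kg of CO2, about {lbs:.0f} lbs")   # roughly 160 kg / 354 lbs
```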
Map of flight path and driving directions from Sudbury to Whale Cove
See the map of the shortest flight path between Sudbury Airport (YSB) and Whale Cove Airport (YXN).
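If you want to draw a similar map yourself, one option is to plot the geodesic between the two airports with cartopy. The sketch below is an illustration using the coordinates listed in the airport tables, not the tool used to generate the map on this page.

```python
# Minimal map sketch of the great-circle flight path, assuming cartopy and matplotlib are installed.
import matplotlib.pyplot as plt
import cartopy.crs as ccrs

ysb = (-80.799, 46.625)   # Sudbury Airport (lon, lat)
yxn = (-92.598, 62.240)   # Whale Cove Airport (lon, lat)

ax = plt.axes(projection=ccrs.PlateCarree())
ax.set_extent([-100, -70, 42, 66], crs=ccrs.PlateCarree())
ax.coastlines()
# transform=ccrs.Geodetic() makes the line follow the great circle rather than a straight chart line
ax.plot([ysb[0], yxn[0]], [ysb[1], yxn[1]], transform=ccrs.Geodetic(), color="tab:red")
ax.plot(*ysb, "o", transform=ccrs.Geodetic())
ax.plot(*yxn, "o", transform=ccrs.Geodetic())
plt.show()
```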
Airport information
| Origin | Sudbury Airport |
|---|---|
| City: | Sudbury |
| Country: | Canada |
| IATA Code: | YSB |
| ICAO Code: | CYSB |
| Coordinates: | 46°37′30″N, 80°47′56″W |

| Destination | Whale Cove Airport |
|---|---|
| City: | Whale Cove |
| Country: | Canada |
| IATA Code: | YXN |
| ICAO Code: | CYXN |
| Coordinates: | 62°14′24″N, 92°35′53″W |