How far is Shungnak, AK, from Nanaimo?
The distance between Nanaimo (Nanaimo Airport) and Shungnak (Shungnak Airport) is 1699 miles / 2735 kilometers / 1477 nautical miles.
The driving distance from Nanaimo (YCD) to Shungnak (SHG) is 2654 miles / 4272 kilometers, and travel time by car is about 88 hours 12 minutes.
Nanaimo Airport – Shungnak Airport
Distance from Nanaimo to Shungnak
There are several ways to calculate the distance from Nanaimo to Shungnak. Here are two standard methods:
Vincenty's formula (applied above)
- 1699.319 miles
- 2734.789 kilometers
- 1476.668 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
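As a sketch of how such a figure can be computed, here is a straightforward Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid (the coordinates are the airport positions listed below; the iteration tolerance and loop cap are arbitrary choices):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns distance in km."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m) and flattening
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):  # iterate the auxiliary longitude until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                                     + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                 * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# YCD (49°3′8″N, 123°52′12″W) to SHG (66°53′17″N, 157°9′43″W)
print(round(vincenty_km(49.0522, -123.87, 66.8881, -157.1619), 1))  # ≈ 2734.8 km
```

This basic form can fail to converge for nearly antipodal points; that is not a concern for this route.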
Haversine formula
- 1695.098 miles
- 2727.996 kilometers
- 1473.000 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
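The haversine calculation fits in a few lines of Python. A minimal sketch, using the airport coordinates from the tables below and assuming a mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YCD (49°3′8″N, 123°52′12″W) to SHG (66°53′17″N, 157°9′43″W)
ycd = (49 + 3 / 60 + 8 / 3600, -(123 + 52 / 60 + 12 / 3600))
shg = (66 + 53 / 60 + 17 / 3600, -(157 + 9 / 60 + 43 / 3600))
print(round(haversine_km(*ycd, *shg)))  # ≈ 2728 km
```

The small discrepancy with the Vincenty figure (about 7 km here) comes from the spherical-Earth assumption.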
How long does it take to fly from Nanaimo to Shungnak?
The estimated flight time from Nanaimo Airport to Shungnak Airport is 3 hours and 43 minutes.
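Estimates like this are typically derived from the great-circle distance using an assumed cruise speed plus a fixed allowance for taxi, climb, and descent; the exact parameters behind the figure above are not stated. A sketch with assumed values (500 mph cruise, 30-minute allowance):

```python
def flight_time(distance_mi, cruise_mph=500, buffer_h=0.5):
    """Rough flight-time estimate: fixed taxi/climb allowance plus cruise time.
    Both parameters are assumptions, not the values used by the site."""
    total_h = buffer_h + distance_mi / cruise_mph
    hours = int(total_h)
    minutes = round((total_h - hours) * 60)
    return hours, minutes

h, m = flight_time(1699.319)
print(f"about {h} h {m} min")
```

Different assumed speeds and allowances shift the answer by tens of minutes, which is why published estimates vary.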
What is the time difference between Nanaimo and Shungnak?
The time difference between Nanaimo and Shungnak is 1 hour: Shungnak is 1 hour behind Nanaimo.
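Nanaimo keeps Pacific Time (the `America/Vancouver` zone) and Shungnak keeps Alaska Time (`America/Anchorage`); since both switch daylight saving time on the same schedule, the offset stays a constant 1 hour year-round. This can be checked with Python's standard-library `zoneinfo`:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Any date works, since both zones change DST together.
naive = datetime(2024, 6, 1, 12, 0)
nanaimo = naive.replace(tzinfo=ZoneInfo("America/Vancouver"))
shungnak = naive.replace(tzinfo=ZoneInfo("America/Anchorage"))
diff_h = (nanaimo.utcoffset() - shungnak.utcoffset()).total_seconds() / 3600
print(diff_h)  # 1.0 → Shungnak is one hour behind Nanaimo
```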
Flight carbon footprint between Nanaimo Airport (YCD) and Shungnak Airport (SHG)
On average, flying from Nanaimo to Shungnak generates about 193 kg (425 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
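The unit conversion behind those two figures uses the standard factor of about 2.20462 pounds per kilogram:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 193  # estimated CO2 per passenger for this flight
co2_lb = round(co2_kg * KG_TO_LB)
print(f"{co2_kg} kg ≈ {co2_lb} lb")  # 193 kg ≈ 425 lb
```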
Map of flight path and driving directions from Nanaimo to Shungnak
See the map of the shortest flight path between Nanaimo Airport (YCD) and Shungnak Airport (SHG).
Airport information
| Origin | Nanaimo Airport |
| --- | --- |
| City | Nanaimo |
| Country | Canada |
| IATA Code | YCD |
| ICAO Code | CYCD |
| Coordinates | 49°3′8″N, 123°52′12″W |
| Destination | Shungnak Airport |
| --- | --- |
| City | Shungnak, AK |
| Country | United States |
| IATA Code | SHG |
| ICAO Code | PAGH |
| Coordinates | 66°53′17″N, 157°9′43″W |