How far is Abbotsford from Ivujivik?
The distance between Ivujivik (Ivujivik Airport) and Abbotsford (Abbotsford International Airport) is 1914 miles / 3081 kilometers / 1664 nautical miles.
The driving distance from Ivujivik (YIK) to Abbotsford (YXX) is 2018 miles / 3247 kilometers, and the travel time by car is about 58 hours.
Ivujivik Airport – Abbotsford International Airport
Distance from Ivujivik to Abbotsford
There are several ways to calculate the distance from Ivujivik to Abbotsford. Here are two standard methods:
Vincenty's formula (applied above)
- 1914.364 miles
- 3080.870 kilometers
- 1663.537 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
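As a rough illustration of how such an ellipsoidal figure can be computed, here is a minimal pure-Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates from the table below converted to decimal degrees. It follows the standard published formulation and is not necessarily the exact code behind the figure above.

```python
import math

# WGS-84 ellipsoid constants
A_AXIS = 6378137.0                    # semi-major axis, metres
FLATTENING = 1 / 298.257223563
B_AXIS = (1 - FLATTENING) * A_AXIS    # semi-minor axis, metres

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in metres between two lat/lon points (Vincenty's inverse formula)."""
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - FLATTENING) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - FLATTENING) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                      # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); cos2_alpha == 0 means both points lie on the equator
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = FLATTENING / 16 * cos2_alpha * (4 + FLATTENING * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * FLATTENING * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = big_b * sin_sigma * (
        cos_2sigma_m + big_b / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - big_b / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * big_a * (sigma - delta_sigma)

# YIK -> YXX, decimal degrees from the airport table below
metres = vincenty_inverse(62.4172, -77.9253, 49.0253, -122.3608)
print(round(metres / 1609.344), "miles")  # roughly 1914 miles, in line with the figure above
```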
Haversine formula
- 1908.853 miles
- 3072.001 kilometers
- 1658.748 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (the great-circle distance, i.e. the shortest path between the two points along the earth's surface).
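A short sketch of the same calculation with the haversine formula. The mean Earth radius of 6371 km is an assumption; the page's own spherical radius may differ slightly, which shifts the result by a few kilometres.

```python
import math

EARTH_RADIUS_KM = 6371.0  # assumed mean Earth radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# YIK -> YXX, decimal degrees from the airport table below
km = haversine_km(62.4172, -77.9253, 49.0253, -122.3608)
print(round(km), "km /", round(km / 1.609344), "miles")  # roughly 3072 km / 1909 miles
```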
How long does it take to fly from Ivujivik to Abbotsford?
The estimated flight time from Ivujivik Airport to Abbotsford International Airport is 4 hours and 7 minutes.
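The page does not say how the flight time is estimated. A common back-of-envelope model is great-circle distance divided by an assumed average block speed; the speed below is an assumption chosen to show that roughly 465 mph reproduces the quoted 4 hours 7 minutes.

```python
# Back-of-envelope check only; the assumed average speed is not from the page.
distance_miles = 1914.364      # Vincenty distance quoted above
avg_speed_mph = 465            # assumed average block speed, including climb and descent
hours = distance_miles / avg_speed_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # -> 4 h 7 min
```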
What is the time difference between Ivujivik and Abbotsford?
The time difference between Ivujivik and Abbotsford is 3 hours: Abbotsford is on Pacific Time, 3 hours behind Ivujivik, which is on Eastern Time.
Flight carbon footprint between Ivujivik Airport (YIK) and Abbotsford International Airport (YXX)
On average, flying from Ivujivik to Abbotsford generates about 210 kg of CO2 per passenger, which is roughly 462 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
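For the unit conversion alone (the emissions figure itself is the page's estimate), a one-line check:

```python
KG_TO_LB = 2.20462            # pounds per kilogram
print(round(210 * KG_TO_LB))  # 463 lb, i.e. the ~462 lbs quoted above up to rounding
```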
Map of flight path and driving directions from Ivujivik to Abbotsford
See the map of the shortest flight path between Ivujivik Airport (YIK) and Abbotsford International Airport (YXX).
Airport information
| Origin | Ivujivik Airport |
| --- | --- |
| City | Ivujivik |
| Country | Canada |
| IATA Code | YIK |
| ICAO Code | CYIK |
| Coordinates | 62°25′2″N, 77°55′31″W |
| Destination | Abbotsford International Airport |
| --- | --- |
| City | Abbotsford |
| Country | Canada |
| IATA Code | YXX |
| ICAO Code | CYXX |
| Coordinates | 49°1′31″N, 122°21′39″W |