
How far is Abbotsford from Wellington?

The distance between Wellington (Wellington International Airport) and Abbotsford (Abbotsford International Airport) is 7300 miles / 11748 kilometers / 6343 nautical miles.

Distance from Wellington to Abbotsford

There are several ways to calculate the distance from Wellington to Abbotsford. Here are two standard methods:

Vincenty's formula (applied above)
  • 7299.642 miles
  • 11747.636 kilometers
  • 6343.216 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
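
To reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, iteration limit, tolerance, and the decimal coordinates (converted from the airport data below) are illustrative choices, not taken from this page.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):                 # iterate on the longitude difference
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)    # equatorial geodesic
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344   # metres -> statute miles

# WLG and YXX in decimal degrees, from the airport information below
print(round(vincenty_miles(-41.32694, 174.80472, 49.02528, -122.36083), 3))
# should land very close to the 7299.642 miles quoted above
```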

Haversine formula
  • 7316.119 miles
  • 11774.152 kilometers
  • 6357.534 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
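
For comparison, a haversine sketch, assuming a mean Earth radius of 3958.8 statute miles (the exact radius a given calculator uses may differ slightly):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius=3958.8):
    """Great-circle distance on a sphere of the given radius, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius * math.asin(math.sqrt(h))

print(round(haversine_miles(-41.32694, 174.80472, 49.02528, -122.36083), 1))
# roughly 7316 miles, in line with the figure above
```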

How long does it take to fly from Wellington to Abbotsford?

The estimated flight time from Wellington International Airport to Abbotsford International Airport is 14 hours and 19 minutes.
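
The page does not say how this estimate is computed. One heuristic that happens to reproduce it, sketched below, adds 30 minutes of taxi/climb/descent overhead to cruise at about 850 km/h; both constants are assumptions, not a published formula.

```python
def estimated_flight_time(distance_km, cruise_kmh=850, overhead_min=30):
    """Rough block-time estimate: fixed overhead plus constant-speed cruise."""
    hours = overhead_min / 60 + distance_km / cruise_kmh
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours and {m} minutes"

print(estimated_flight_time(11747.636))  # -> 14 hours and 19 minutes
```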

Flight carbon footprint between Wellington International Airport (WLG) and Abbotsford International Airport (YXX)

On average, flying from Wellington to Abbotsford generates about 897 kg of CO2 per passenger, and 897 kilograms equals 1,978 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
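
The pound figure follows directly from the exact kilogram-to-pound factor; dividing by the route distance also gives the implied per-mile intensity (derived arithmetic, not a published factor):

```python
co2_kg = 897                 # per-passenger estimate quoted above
distance_miles = 7299.642

print(round(co2_kg * 2.20462262))          # -> 1978 lb
print(round(co2_kg / distance_miles, 3))   # -> 0.123 kg CO2 per passenger-mile
```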

Map of flight path from Wellington to Abbotsford

See the map of the shortest flight path between Wellington International Airport (WLG) and Abbotsford International Airport (YXX).

Airport information

Origin: Wellington International Airport
City: Wellington
Country: New Zealand
IATA Code: WLG
ICAO Code: NZWN
Coordinates: 41°19′37″S, 174°48′17″E
Destination: Abbotsford International Airport
City: Abbotsford
Country: Canada
IATA Code: YXX
ICAO Code: CYXX
Coordinates: 49°1′31″N, 122°21′39″W