How far is Pittsburgh, PA, from Abbotsford?

The distance between Abbotsford (Abbotsford International Airport) and Pittsburgh (Pittsburgh International Airport) is 2125 miles / 3419 kilometers / 1846 nautical miles.

The driving distance from Abbotsford (YXX) to Pittsburgh (PIT) is 2602 miles / 4188 kilometers, and travel time by car is about 46 hours 51 minutes.

Abbotsford International Airport – Pittsburgh International Airport

Distance: 2125 miles / 3419 kilometers / 1846 nautical miles

Distance from Abbotsford to Pittsburgh

There are several ways to calculate the distance from Abbotsford to Pittsburgh. Here are two standard methods:

Vincenty's formula (applied above)
  • 2124.633 miles
  • 3419.265 kilometers
  • 1846.255 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
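As a rough illustration, the same kind of ellipsoidal-model calculation can be sketched in Python with the geopy library (its geodesic() function uses Karney's algorithm on the WGS-84 ellipsoid, which agrees closely with the Vincenty figure on this route). The coordinates are the ones listed in the airport information below.

```python
# A minimal sketch of an ellipsoidal-model distance calculation, comparable to
# the Vincenty result above. geopy's geodesic() uses Karney's algorithm on the
# WGS-84 ellipsoid; coordinates are taken from the airport information section.
from geopy.distance import geodesic

yxx = (49.025278, -122.360833)   # Abbotsford International Airport (YXX)
pit = (40.491389, -80.232778)    # Pittsburgh International Airport (PIT)

d = geodesic(yxx, pit)
print(f"{d.miles:.3f} miles")       # ~2124.6 miles
print(f"{d.km:.3f} kilometers")     # ~3419.3 km
print(f"{d.nautical:.3f} NM")       # ~1846.3 NM
```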

Haversine formula
  • 2119.302 miles
  • 3410.686 kilometers
  • 1841.623 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
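The haversine calculation itself is short enough to write directly. The sketch below assumes a mean Earth radius of 6371 km and uses the airport coordinates listed further down.

```python
# A minimal sketch of the haversine (great-circle) calculation described above,
# assuming a spherical Earth with mean radius 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers between two lat/lon points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(49.025278, -122.360833, 40.491389, -80.232778)
print(f"{km:.3f} km")             # ~3410.7 km
print(f"{km * 0.621371:.3f} mi")  # ~2119.3 miles
```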

How long does it take to fly from Abbotsford to Pittsburgh?

The estimated flight time from Abbotsford International Airport to Pittsburgh International Airport is 4 hours and 31 minutes.
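The exact model behind this estimate is not stated. A common rough approach, assumed in the sketch below, is to add a fixed allowance for taxi, climb, and descent to the cruise time at a typical jet speed; with these assumed numbers the result comes out in the same four-to-five-hour range.

```python
# A rough sketch of a flight-time estimate (assumption: fixed 30-minute
# allowance for taxi, climb, and descent, plus cruise at 500 mph; the exact
# model used for the figure above is not stated).
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    cruise_min = distance_miles / cruise_mph * 60
    return divmod(round(cruise_min + overhead_min), 60)  # (hours, minutes)

hours, minutes = estimate_flight_time(2125)
print(f"about {hours} h {minutes} min")  # roughly 4-5 hours with these assumptions
```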

Flight carbon footprint between Abbotsford International Airport (YXX) and Pittsburgh International Airport (PIT)

On average, flying from Abbotsford to Pittsburgh generates about 232 kg of CO2 per passenger (232 kilograms is equal to 511 pounds). The figures are estimates and include only the CO2 generated by burning jet fuel.
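The per-passenger figure works out to roughly 0.11 kg of CO2 per mile flown on this route. The sketch below assumes a constant per-mile emission factor; real calculators refine this by aircraft type, seating density, and flight phase.

```python
# A simplified sketch of a per-passenger CO2 estimate (assumption: a constant
# per-mile emission factor implied by 232 kg over 2125 miles).
KG_CO2_PER_PASSENGER_MILE = 0.109
KG_TO_LBS = 2.20462

def co2_per_passenger(distance_miles, factor=KG_CO2_PER_PASSENGER_MILE):
    kg = distance_miles * factor
    return kg, kg * KG_TO_LBS

kg, lbs = co2_per_passenger(2125)
print(f"about {kg:.0f} kg ({lbs:.0f} lbs) of CO2 per passenger")  # ~232 kg / ~511 lbs
```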

Map of flight path and driving directions from Abbotsford to Pittsburgh

See the map of the shortest flight path between Abbotsford International Airport (YXX) and Pittsburgh International Airport (PIT).

Airport information

Origin Abbotsford International Airport
City: Abbotsford
Country: Canada
IATA Code: YXX
ICAO Code: CYXX
Coordinates: 49°1′31″N, 122°21′39″W
Destination Pittsburgh International Airport
City: Pittsburgh, PA
Country: United States
IATA Code: PIT
ICAO Code: KPIT
Coordinates: 40°29′29″N, 80°13′58″W