How far is Wagga Wagga from Pittsburgh, PA?

The distance between Pittsburgh (Pittsburgh International Airport) and Wagga Wagga (Wagga Wagga Airport) is 9839 miles / 15834 kilometers / 8550 nautical miles.

Pittsburgh International Airport – Wagga Wagga Airport

Distance: 9839 miles / 15834 kilometers / 8550 nautical miles
Flight time: 19 h 7 min
CO2 emission: 1 276 kg

Distance from Pittsburgh to Wagga Wagga

There are several ways to calculate the distance from Pittsburgh to Wagga Wagga. Here are two standard methods:

Vincenty's formula (applied above)
  • 9838.610 miles
  • 15833.708 kilometers
  • 8549.518 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
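The page does not show its implementation, but a minimal Python sketch of Vincenty's inverse formula looks like the following. The WGS-84 ellipsoid parameters are an assumption (the ellipsoid used for the figures above is not stated), and the decimal coordinates are converted from the degree/minute/second values in the airport information section below.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2):
        """Geodesic distance in metres between two points, Vincenty inverse formula."""
        a = 6378137.0                     # WGS-84 semi-major axis (m), assumed ellipsoid
        f = 1 / 298.257223563             # WGS-84 flattening
        b = (1 - f) * a                   # semi-minor axis

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):              # iterate until the longitude difference converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos_sq_alpha = 1 - sin_alpha ** 2
            cos2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
            C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (2 * cos2sm ** 2 - 1)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
        B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
        d_sigma = B * sin_sigma * (cos2sm + B / 4 * (
            cos_sigma * (2 * cos2sm ** 2 - 1) -
            B / 6 * cos2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos2sm ** 2 - 3)))
        return b * A * (sigma - d_sigma)  # ellipsoidal distance in metres

    # PIT 40°29′29″N, 80°13′58″W and WGA 35°9′55″S, 147°27′57″E in decimal degrees
    metres = vincenty_distance(40.49139, -80.23278, -35.16528, 147.46583)
    print(round(metres / 1609.344, 3), "miles")       # ≈ 9838.6
    print(round(metres / 1000, 3), "kilometers")      # ≈ 15833.7
    print(round(metres / 1852, 3), "nautical miles")  # ≈ 8549.5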

Haversine formula
  • 9839.626 miles
  • 15835.343 kilometers
  • 8550.401 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
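A haversine sketch in the same style is much shorter. The 6 371 km mean Earth radius below is an assumption, since the radius used for the figures above is not stated; the result scales linearly with whatever radius is chosen.

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in kilometres on a sphere of the given radius."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    print(haversine_distance(40.49139, -80.23278, -35.16528, 147.46583))
    # ≈ 15 835 km with this radius choice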

How long does it take to fly from Pittsburgh to Wagga Wagga?

The estimated flight time from Pittsburgh International Airport to Wagga Wagga Airport is 19 hours and 7 minutes.
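The page does not say how this flight time is derived. As a rough check, dividing the Vincenty distance by an assumed average speed of about 515 mph (a typical long-haul figure, not something stated on the page) lands within about a minute of the 19 h 7 min shown:

    distance_miles = 9838.610          # Vincenty distance from above
    assumed_avg_mph = 515              # assumption: average speed including climb and descent
    hours = distance_miles / assumed_avg_mph
    print(f"{int(hours)} h {round(hours % 1 * 60)} min")   # ≈ 19 h 6 min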

Flight carbon footprint between Pittsburgh International Airport (PIT) and Wagga Wagga Airport (WGA)

On average, flying from Pittsburgh to Wagga Wagga generates about 1 276 kg of CO2 per passenger, and 1 276 kilograms equals 2 814 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
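The emission model behind the 1 276 kg figure is not published on the page; the snippet below only reproduces the kilogram-to-pound conversion and the per-mile rate implied by the page's own numbers.

    co2_kg = 1276                        # per-passenger estimate quoted above
    print(round(co2_kg * 2.20462))       # ≈ 2813 lbs (the page shows 2 814, so the kg value is likely rounded down)
    print(round(co2_kg / 9838.610, 3))   # ≈ 0.13 kg of CO2 per passenger-mile, implied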

Map of flight path from Pittsburgh to Wagga Wagga

See the map of the shortest flight path between Pittsburgh International Airport (PIT) and Wagga Wagga Airport (WGA).

Airport information

Origin: Pittsburgh International Airport
City: Pittsburgh, PA
Country: United States
IATA Code: PIT
ICAO Code: KPIT
Coordinates: 40°29′29″N, 80°13′58″W

Destination: Wagga Wagga Airport
City: Wagga Wagga
Country: Australia
IATA Code: WGA
ICAO Code: YSWG
Coordinates: 35°9′55″S, 147°27′57″E