
How far is Barcaldine from Pittsburgh, PA?

The distance between Pittsburgh (Pittsburgh International Airport) and Barcaldine (Barcaldine Airport) is 9564 miles / 15392 kilometers / 8311 nautical miles.

Pittsburgh International Airport – Barcaldine Airport

  • Distance: 9564 miles / 15392 kilometers / 8311 nautical miles
  • Flight time: 18 h 36 min
  • CO2 emission: 1 234 kg


Distance from Pittsburgh to Barcaldine

There are several ways to calculate the distance from Pittsburgh to Barcaldine. Here are two standard methods:

Vincenty's formula (applied above)
  • 9563.962 miles
  • 15391.704 kilometers
  • 8310.855 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
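The ellipsoidal figure above can be reproduced with a short standard-library Python script. The sketch below implements Vincenty's inverse formula on the WGS-84 ellipsoid; the airport coordinates are the decimal-degree equivalents of the DMS values listed under Airport information, and the printed result should match the numbers above to within rounding.

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2):
        """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
        a = 6378137.0               # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563       # WGS-84 flattening
        b = (1 - f) * a             # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):        # iterate until the longitude difference converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0          # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sig_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sig_m + C * cos_sigma *
                                         (-1 + 2 * cos_2sig_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sig_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sig_m ** 2) -
            B / 6 * cos_2sig_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sig_m ** 2)))
        return b * A * (sigma - delta_sigma)

    # PIT and BCI coordinates converted to decimal degrees
    pit = (40.49139, -80.23278)   # 40°29′29″N, 80°13′58″W
    bci = (-23.56528, 145.30694)  # 23°33′55″S, 145°18′25″E

    metres = vincenty_inverse(*pit, *bci)
    print(f"{metres / 1000:.1f} km / {metres / 1609.344:.1f} mi")  # ≈ 15391.7 km / 9564.0 mi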

Haversine formula
  • 9562.802 miles
  • 15389.838 kilometers
  • 8309.848 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the earth's surface).
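The spherical figure is even simpler to reproduce. The sketch below assumes a mean Earth radius of 6371 km, so the exact result depends on the radius the calculator itself uses:

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given radius (km)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(a))

    km = haversine_km(40.49139, -80.23278, -23.56528, 145.30694)  # PIT -> BCI
    print(f"{km:.1f} km / {km / 1.609344:.1f} mi")  # ≈ 15390 km, close to the Haversine row above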

How long does it take to fly from Pittsburgh to Barcaldine?

The estimated flight time from Pittsburgh International Airport to Barcaldine Airport is 18 hours and 36 minutes.
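The page does not state how this estimate is derived; dividing the quoted distance by the quoted time gives an average block speed of roughly 15392 km / 18.6 h ≈ 827 km/h (about 514 mph). The sketch below rebuilds the estimate from that assumed speed, which is an inference from this page's own figures rather than a published formula:

    def estimated_flight_time_hours(distance_km, block_speed_kmh=827.5):
        """Estimate block time as distance divided by an assumed average speed.

        block_speed_kmh is back-calculated from this page's figures
        (15392 km in 18 h 36 min); real schedules also depend on routing,
        winds and taxi time.
        """
        return distance_km / block_speed_kmh

    hours = estimated_flight_time_hours(15392)
    h, m = int(hours), round((hours % 1) * 60)
    print(f"{h} h {m} min")  # ≈ 18 h 36 min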

Flight carbon footprint between Pittsburgh International Airport (PIT) and Barcaldine Airport (BCI)

On average, flying from Pittsburgh to Barcaldine generates about 1 234 kg of CO2 per passenger, which is equivalent to about 2 720 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
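Dividing the quoted footprint by the quoted distance works out to roughly 80 g of CO2 per passenger-kilometre. The snippet below shows the unit conversions; the intensity figure is derived from this page's numbers, not a general emission factor:

    KG_PER_LB = 0.45359237  # exact definition of the pound

    co2_kg = 1234.0
    distance_km = 15392.0

    co2_lbs = co2_kg / KG_PER_LB                     # ≈ 2 720 lbs, as stated above
    grams_per_pax_km = co2_kg * 1000 / distance_km   # ≈ 80 g CO2 per passenger-km
    print(f"{co2_lbs:.1f} lbs, {grams_per_pax_km:.0f} g/km")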

Map of flight path from Pittsburgh to Barcaldine

See the map of the shortest flight path between Pittsburgh International Airport (PIT) and Barcaldine Airport (BCI).

Airport information

Origin: Pittsburgh International Airport
City: Pittsburgh, PA
Country: United States
IATA Code: PIT
ICAO Code: KPIT
Coordinates: 40°29′29″N, 80°13′58″W
Destination: Barcaldine Airport
City: Barcaldine
Country: Australia
IATA Code: BCI
ICAO Code: YBAR
Coordinates: 23°33′55″S, 145°18′25″E
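The code sketches earlier on this page use decimal degrees; a small helper shows how the DMS coordinates listed here convert, taking the sign from the hemisphere letter:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # PIT: 40°29′29″N, 80°13′58″W  ->  (40.49139, -80.23278)
    print(dms_to_decimal(40, 29, 29, "N"), dms_to_decimal(80, 13, 58, "W"))
    # BCI: 23°33′55″S, 145°18′25″E ->  (-23.56528, 145.30694)
    print(dms_to_decimal(23, 33, 55, "S"), dms_to_decimal(145, 18, 25, "E"))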