Air Miles Calculator

How far is Barcaldine from Charleston, SC?

The distance between Charleston (Charleston International Airport) and Barcaldine (Barcaldine Airport) is 9627 miles / 15494 kilometers / 8366 nautical miles.

Charleston International Airport – Barcaldine Airport

  • Distance: 9627 miles / 15494 kilometers / 8366 nautical miles
  • Flight time: 18 h 43 min
  • CO2 emission: 1 243 kg


Distance from Charleston to Barcaldine

There are several ways to calculate the distance from Charleston to Barcaldine. Here are two standard methods:

Vincenty's formula (applied above)
  • 9627.413 miles
  • 15493.819 kilometers
  • 8365.993 nautical miles
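The three figures above are one distance expressed in different units. A quick sketch of the conversions, using the exact definitions 1 mile = 1.609344 km and 1 nautical mile = 1.852 km:

```python
# Convert the kilometre figure into statute and nautical miles.
KM_PER_MILE = 1.609344          # exact, by definition of the international mile
KM_PER_NAUTICAL_MILE = 1.852    # exact, by definition of the nautical mile

km = 15493.819
miles = km / KM_PER_MILE
nautical = km / KM_PER_NAUTICAL_MILE

print(f"{miles:.3f} mi, {nautical:.3f} nmi")  # ≈ 9627.413 mi, 8365.993 nmi
```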

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
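As an illustration (not the calculator's own code), a minimal implementation of Vincenty's inverse formula on the WGS-84 ellipsoid reproduces the figure above from the airport coordinates listed at the bottom of this page:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometres."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Normalise the longitude difference to (-pi, pi] so the iteration converges.
    L = math.atan2(math.sin(L), math.cos(L))

    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# CHS: 32°53′54″N, 80°2′25″W   BCI: 23°33′55″S, 145°18′25″E
chs = (32 + 53/60 + 54/3600, -(80 + 2/60 + 25/3600))
bci = (-(23 + 33/60 + 55/3600), 145 + 18/60 + 25/3600)
print(f"{vincenty_km(*chs, *bci):.3f} km")  # ≈ 15493.8 km
```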

Haversine formula
  • 9624.122 miles
  • 15488.523 kilometers
  • 8363.133 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
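A sketch of the haversine calculation in Python, using the same airport coordinates and a mean Earth radius of 6371 km (the radius the site itself uses is not stated, so the last digits may differ slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# CHS and BCI coordinates from the airport information below
d = haversine_km(32.8983, -80.0403, -23.5653, 145.3069)
print(f"{d:.1f} km")
```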

How long does it take to fly from Charleston to Barcaldine?

The estimated flight time from Charleston International Airport to Barcaldine Airport is 18 hours and 43 minutes.
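The calculator's timing model is not published, but the average speed it implies is easy to back out from the quoted distance and duration:

```python
# Implied average speed from the quoted figures (9627 mi in 18 h 43 min).
distance_miles = 9627
duration_hours = 18 + 43 / 60
speed_mph = distance_miles / duration_hours
print(f"{speed_mph:.0f} mph")  # ≈ 514 mph, a plausible long-haul block speed
```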

Flight carbon footprint between Charleston International Airport (CHS) and Barcaldine Airport (BCI)

On average, flying from Charleston to Barcaldine generates about 1 243 kg (2 741 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path from Charleston to Barcaldine

See the map of the shortest flight path between Charleston International Airport (CHS) and Barcaldine Airport (BCI).

Airport information

Origin: Charleston International Airport
City: Charleston, SC
Country: United States
IATA Code: CHS
ICAO Code: KCHS
Coordinates: 32°53′54″N, 80°2′25″W
Destination: Barcaldine Airport
City: Barcaldine
Country: Australia
IATA Code: BCI
ICAO Code: YBAR
Coordinates: 23°33′55″S, 145°18′25″E