How far is Barcaldine from Windsor Locks, CT?

The distance between Windsor Locks (Bradley International Airport) and Barcaldine (Barcaldine Airport) is 9921 miles / 15966 kilometers / 8621 nautical miles.

Bradley International Airport – Barcaldine Airport

Distance: 9921 miles / 15966 kilometers / 8621 nautical miles
Flight time: 19 h 17 min
CO2 emission: 1 289 kg

Distance from Windsor Locks to Barcaldine

There are several ways to calculate the distance from Windsor Locks to Barcaldine. Here are two standard methods:

Vincenty's formula (applied above)
  • 9920.774 miles
  • 15965.939 kilometers
  • 8620.917 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
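As a rough illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The ellipsoid constants, iteration limit, and tolerance are assumptions; this calculator's exact implementation may differ, and the iteration can fail to converge for nearly antipodal points.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty distance in statute miles on the WGS-84 ellipsoid (sketch)."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                  if cos_sq_alpha else 0.0)        # equatorial geodesic
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    d_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2)
        - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344    # metres -> statute miles

# BDL (41°56'20"N, 72°40'59"W) -> BCI (23°33'55"S, 145°18'25"E)
print(round(vincenty_miles(41.9389, -72.6831, -23.5653, 145.3069), 1))
```

With the WGS-84 constants assumed here, the result should land near the ~9 920.8-mile figure quoted above.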

Haversine formula
  • 9919.548 miles
  • 15963.964 kilometers
  • 8619.851 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
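For comparison, a minimal haversine sketch in the same spirit. The 6 371 km mean Earth radius is a common convention and an assumption here, so the result may differ slightly from the figure quoted above.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in statute miles, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a)) / 1.609344  # km -> miles

# BDL (41°56'20"N, 72°40'59"W) -> BCI (23°33'55"S, 145°18'25"E)
print(round(haversine_miles(41.9389, -72.6831, -23.5653, 145.3069), 1))
```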

How long does it take to fly from Windsor Locks to Barcaldine?

The estimated flight time from Bradley International Airport to Barcaldine Airport is 19 hours and 17 minutes.
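The calculator's exact assumptions aren't stated, but working backwards, covering roughly 9 921 miles in 19 h 17 min implies an average block speed of about 515 mph. A back-of-the-envelope sketch along those lines (the 515 mph figure is an assumption, not the site's published method):

```python
def estimate_flight_time(distance_miles, avg_speed_mph=515.0):
    """Rough block-time estimate: distance divided by an assumed average speed."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m} min"

print(estimate_flight_time(9920.774))  # about "19 h 16 min" with the assumed speed
```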

Flight carbon footprint between Bradley International Airport (BDL) and Barcaldine Airport (BCI)

On average, flying from Windsor Locks to Barcaldine generates about 1 289 kg of CO2 per passenger, which is equivalent to 2 842 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
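The kilogram-to-pound conversion behind that figure is a single multiplication:

```python
co2_kg = 1289
co2_lbs = co2_kg * 2.20462   # 1 kg ≈ 2.20462 lb
print(round(co2_lbs))        # ≈ 2842
```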

Map of flight path from Windsor Locks to Barcaldine

See the map of the shortest flight path between Bradley International Airport (BDL) and Barcaldine Airport (BCI).

Airport information

Origin: Bradley International Airport
City: Windsor Locks, CT
Country: United States
IATA Code: BDL
ICAO Code: KBDL
Coordinates: 41°56′20″N, 72°40′59″W
Destination: Barcaldine Airport
City: Barcaldine
Country: Australia
IATA Code: BCI
ICAO Code: YBAR
Coordinates: 23°33′55″S, 145°18′25″E