
How far is Blackwater from Windsor Locks, CT?

The distance between Windsor Locks (Bradley International Airport) and Blackwater (Blackwater Airport) is 9750 miles / 15691 kilometers / 8472 nautical miles.

Bradley International Airport – Blackwater Airport

Distance: 9750 miles / 15691 kilometers / 8472 nautical miles
Flight time: 18 h 57 min
CO2 emission: 1 262 kg


Distance from Windsor Locks to Blackwater

There are several ways to calculate the distance from Windsor Locks to Blackwater. Here are two standard methods:

Vincenty's formula (applied above)
  • 9749.766 miles
  • 15690.727 kilometers
  • 8472.315 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
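For reference, here is a minimal Python sketch of Vincenty's inverse formula. It assumes the WGS-84 ellipsoid (the page does not state which ellipsoid the calculator uses) and the decimal-degree coordinates derived from the airport data below; the iteration cap and tolerance are illustrative choices, not the site's exact implementation.

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns km."""
    a = 6378137.0            # semi-major axis, metres
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero on equatorial lines where cos2_alpha == 0
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000  # metres -> km

# BDL (41°56'20"N, 72°40'59"W) to BLT (23°36'11"S, 148°48'25"E)
print(vincenty_distance_km(41.9389, -72.6831, -23.6031, 148.8069))  # ~15690.7 km
```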

Haversine formula
  • 9748.731 miles
  • 15689.062 kilometers
  • 8471.416 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
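A corresponding haversine sketch in Python follows; the mean Earth radius of 6371 km is an assumption (the page does not state the radius the calculator uses), so the result only approximates the figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere via the haversine formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_km(41.9389, -72.6831, -23.6031, 148.8069))  # ~15689 km
```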

How long does it take to fly from Windsor Locks to Blackwater?

The estimated flight time from Bradley International Airport to Blackwater Airport is 18 hours and 57 minutes.
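The site's formula is not published, but the figure is consistent with a simple model of a fixed 30-minute takeoff-and-landing overhead plus cruise at roughly 850 km/h; the sketch below reproduces the estimate under those assumed parameters.

```python
def estimated_flight_time(distance_km, cruise_kmh=850, overhead_min=30):
    """Crude flight-time model: fixed taxi/climb overhead plus cruise time."""
    total_min = overhead_min + distance_km / cruise_kmh * 60
    h, m = divmod(int(total_min), 60)  # truncate partial minutes
    return f"{h} h {m} min"

print(estimated_flight_time(15691))  # -> "18 h 57 min"
```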

Flight carbon footprint between Bradley International Airport (BDL) and Blackwater Airport (BLT)

On average, flying from Windsor Locks to Blackwater generates about 1 262 kg of CO2 per passenger, which equals about 2 783 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
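The pound figure is a straightforward unit conversion (1 kg ≈ 2.20462 lb):

```python
co2_kg = 1262
lbs = co2_kg * 2.20462  # pounds per kilogram
print(lbs)              # ~2782.2; the page reports this rounded up to 2 783 lbs
```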

Map of flight path from Windsor Locks to Blackwater

See the map of the shortest flight path between Bradley International Airport (BDL) and Blackwater Airport (BLT).

Airport information

Origin: Bradley International Airport
City: Windsor Locks, CT
Country: United States
IATA Code: BDL
ICAO Code: KBDL
Coordinates: 41°56′20″N, 72°40′59″W
Destination: Blackwater Airport
City: Blackwater
Country: Australia
IATA Code: BLT
ICAO Code: YBTR
Coordinates: 23°36′11″S, 148°48′25″E
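The coordinates above are given in degrees, minutes, and seconds; the decimal-degree values used in the earlier sketches can be derived with a small helper like the hypothetical one below.

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(41, 56, 20, "N"))   # ~41.9389  (BDL latitude)
print(dms_to_decimal(148, 48, 25, "E"))  # ~148.8069 (BLT longitude)
```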