How far is Windsor Locks, CT, from Hall Beach?

The distance between Hall Beach (Hall Beach Airport) and Windsor Locks (Bradley International Airport) is 1883 miles / 3030 kilometers / 1636 nautical miles.

Hall Beach Airport – Bradley International Airport
1883 miles / 3030 kilometers / 1636 nautical miles

Distance from Hall Beach to Windsor Locks

There are several ways to calculate the distance from Hall Beach to Windsor Locks. Here are two standard methods:

Vincenty's formula (applied above)
  • 1882.685 miles
  • 3029.889 kilometers
  • 1636.009 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
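
To reproduce the figure above, here is a minimal sketch of Vincenty's inverse formula in Python on the WGS-84 ellipsoid. The function name vincenty_inverse, the convergence tolerance, and the iteration cap are our own choices, not anything published by this calculator.

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Distance in metres between two points on the WGS-84 ellipsoid."""
        a = 6378137.0           # semi-major axis (m)
        f = 1 / 298.257223563   # flattening
        b = (1 - f) * a         # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        lam = L
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        # Iterate lambda until it stabilises (may not converge for
        # near-antipodal points, a known limitation of Vincenty's method).
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0      # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)  # equatorial lines
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # YUX and BDL in decimal degrees (converted from the DMS values listed below)
    yux = (68.775833, -81.243333)
    bdl = (41.938889, -72.683056)
    print(vincenty_inverse(*yux, *bdl) / 1609.344)  # ~1882.7 miles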

Haversine formula
  • 1880.410 miles
  • 3026.227 kilometers
  • 1634.032 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
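
The haversine version is compact enough to show in full. This sketch assumes the commonly used mean Earth radius of 6371.0 km; the site does not state which radius it uses, so the result may differ from the quoted figure by a kilometre or so.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in kilometres, assuming a spherical Earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        d_phi = math.radians(lat2 - lat1)
        d_lam = math.radians(lon2 - lon1)
        h = (math.sin(d_phi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    print(haversine_km(68.775833, -81.243333, 41.938889, -72.683056))
    # ~3026 km, in line with the table above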

How long does it take to fly from Hall Beach to Windsor Locks?

The estimated flight time from Hall Beach Airport to Bradley International Airport is 4 hours and 3 minutes.
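
The site does not say how it derives this estimate. A common rule of thumb, shown below purely as an illustration, is cruise time at a typical jet speed plus a fixed allowance for taxi, climb, and descent; with the constants assumed here it does not exactly reproduce the 4 hours 3 minutes above.

    def flight_time_estimate(distance_miles, cruise_mph=500, buffer_min=30):
        """Rule-of-thumb estimate: cruise time plus a fixed taxi/climb buffer.
        Both constants are assumptions; the calculator's model is not disclosed."""
        total_min = distance_miles / cruise_mph * 60 + buffer_min
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} h {minutes:02d} min"

    print(flight_time_estimate(1883))  # 4 h 16 min with these constants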

What is the time difference between Hall Beach and Windsor Locks?

There is no time difference between Hall Beach and Windsor Locks.

Flight carbon footprint between Hall Beach Airport (YUX) and Bradley International Airport (BDL)

On average, flying from Hall Beach to Windsor Locks generates about 207 kg (456 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
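
The pound figure is simply the standard unit conversion at 2.20462 lb per kg:

    KG_TO_LB = 2.20462            # pounds per kilogram
    print(round(207 * KG_TO_LB))  # 456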

Map of flight path from Hall Beach to Windsor Locks

See the map of the shortest flight path between Hall Beach Airport (YUX) and Bradley International Airport (BDL).

Airport information

Origin Hall Beach Airport
City: Hall Beach
Country: Canada
IATA Code: YUX
ICAO Code: CYUX
Coordinates: 68°46′33″N, 81°14′36″W
Destination Bradley International Airport
City: Windsor Locks, CT
Country: United States
IATA Code: BDL
ICAO Code: KBDL
Coordinates: 41°56′20″N, 72°40′59″W
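
The distance functions above expect decimal degrees, so the coordinates listed here need converting first. A small helper (dms_to_decimal is a name chosen for this sketch):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter
        to signed decimal degrees (negative for south and west)."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(68, 46, 33, "N"))  # 68.7758... (YUX latitude)
    print(dms_to_decimal(72, 40, 59, "W"))  # -72.6831... (BDL longitude)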