
How far is Hughenden from Windsor Locks, CT?

The distance between Windsor Locks (Bradley International Airport) and Hughenden (Hughenden Airport) is 9848 miles / 15849 kilometers / 8558 nautical miles.

Bradley International Airport – Hughenden Airport

Distance: 9848 miles / 15849 kilometers / 8558 nautical miles
Flight time: 19 h 8 min
CO2 emission: 1 278 kg


Distance from Windsor Locks to Hughenden

There are several ways to calculate the distance from Windsor Locks to Hughenden. Here are two standard methods:

Vincenty's formula (applied above)
  • 9848.142 miles
  • 15849.048 kilometers
  • 8557.801 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
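
As a rough check, the ellipsoidal figure above can be reproduced with the geopy library, whose geodesic solver works on the same WGS-84 ellipsoid as Vincenty's formula and agrees with it to well under a meter on a route like this. A minimal sketch in Python, assuming geopy is installed (pip install geopy):

    from geopy.distance import geodesic

    # Decimal-degree coordinates taken from the airport information below
    bdl = (41.9389, -72.6831)   # Bradley International Airport (BDL)
    hgd = (-20.8150, 144.2250)  # Hughenden Airport (HGD)

    d = geodesic(bdl, hgd)  # distance on the WGS-84 ellipsoid
    print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} nm")
    # expected output: roughly 9848 mi / 15849 km / 8558 nm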

Haversine formula
  • 9846.523 miles
  • 15846.442 kilometers
  • 8556.394 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the surface of the sphere).
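
The haversine formula is simple enough to implement directly. A self-contained sketch using a mean Earth radius of 6371 km (the exact radius chosen shifts the result by a few kilometers on a route this long):

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two lat/lon points on a sphere."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    # BDL to HGD, decimal degrees from the airport information below
    print(haversine_km(41.9389, -72.6831, -20.8150, 144.2250))
    # ≈ 15 846 km, matching the haversine figure above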

How long does it take to fly from Windsor Locks to Hughenden?

The estimated flight time from Bradley International Airport to Hughenden Airport is 19 hours and 8 minutes.
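
The page does not state the speed assumption behind this estimate, but the implied average speed over the whole flight is easy to back out. A quick check:

    distance_mi = 9848
    hours = 19 + 8 / 60          # 19 h 8 min
    print(distance_mi / hours)   # ≈ 515 mph implied average speed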

Flight carbon footprint between Bradley International Airport (BDL) and Hughenden Airport (HGD)

On average, flying from Windsor Locks to Hughenden generates about 1 278 kg of CO2 per passenger; 1 278 kilograms equals 2 817 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
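
The kilograms-to-pounds conversion is plain arithmetic (1 kg ≈ 2.20462 lb):

    co2_kg = 1278
    print(co2_kg * 2.20462)  # ≈ 2817.5, i.e. about 2 817 pounds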

Map of flight path from Windsor Locks to Hughenden

See the map of the shortest flight path between Bradley International Airport (BDL) and Hughenden Airport (HGD).

Airport information

Origin: Bradley International Airport
City: Windsor Locks, CT
Country: United States
IATA Code: BDL
ICAO Code: KBDL
Coordinates: 41°56′20″N, 72°40′59″W

Destination: Hughenden Airport
City: Hughenden
Country: Australia
IATA Code: HGD
ICAO Code: YHUG
Coordinates: 20°48′54″S, 144°13′30″E
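
The DMS coordinates above convert to the decimal degrees used in the earlier sketches. A small helper, assuming the usual convention that north and east are positive:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds and a hemisphere letter to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # BDL: 41°56′20″N, 72°40′59″W  ->  (41.9389, -72.6831)
    print(dms_to_decimal(41, 56, 20, "N"), dms_to_decimal(72, 40, 59, "W"))
    # HGD: 20°48′54″S, 144°13′30″E  ->  (-20.8150, 144.2250)
    print(dms_to_decimal(20, 48, 54, "S"), dms_to_decimal(144, 13, 30, "E"))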