
How far is Mount Isa from Windsor Locks, CT?

The distance between Windsor Locks (Bradley International Airport) and Mount Isa (Mount Isa Airport) is 10061 miles / 16191 kilometers / 8743 nautical miles.

Bradley International Airport – Mount Isa Airport

Distance: 10061 miles / 16191 kilometers / 8743 nautical miles
Flight time: 19 h 32 min
CO2 emission: 1 311 kg


Distance from Windsor Locks to Mount Isa

There are several ways to calculate the distance from Windsor Locks to Mount Isa. Here are two standard methods:

Vincenty's formula (applied above)
  • 10060.689 miles
  • 16191.110 kilometers
  • 8742.500 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
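For readers who want to reproduce the ellipsoidal figure, here is a small Python sketch using geopy. Note that geopy's geodesic() uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's iteration, but both treat the Earth as an ellipsoid and agree to well under a metre on a route like this; the coordinates are the airport coordinates listed at the bottom of this page.

```python
# Ellipsoidal (WGS-84) distance between BDL and ISA.
from geopy.distance import geodesic

bdl = (41.938889, -72.683056)   # Bradley International Airport (41°56′20″N, 72°40′59″W)
isa = (-20.663889, 139.488889)  # Mount Isa Airport (20°39′50″S, 139°29′20″E)

d = geodesic(bdl, isa)
print(f"{d.miles:.3f} mi, {d.km:.3f} km, {d.nautical:.3f} nmi")
# Expected to land very close to the Vincenty figures above (~10060.7 miles).
```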

Haversine formula
  • 10059.063 miles
  • 16188.493 kilometers
  • 8741.087 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
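The haversine formula is simple enough to implement directly. The sketch below assumes a mean Earth radius of 3958.8 miles; the exact result shifts slightly depending on which radius is used.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance on a sphere with the given radius (mean Earth radius in miles)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_mi * math.asin(math.sqrt(a))

print(round(haversine_miles(41.938889, -72.683056, -20.663889, 139.488889), 3))
# Roughly 10059 miles, in line with the haversine figure above.
```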

How long does it take to fly from Windsor Locks to Mount Isa?

The estimated flight time from Bradley International Airport to Mount Isa Airport is 19 hours and 32 minutes.
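The calculator does not publish the assumptions behind this estimate. A common approach is great-circle distance divided by an average block speed; in the sketch below the 515 mph average is back-solved from the quoted 19 h 32 min over 10061 miles, so it is an illustrative assumption, not the site's actual parameter.

```python
# Rough flight-time estimator: distance divided by an assumed average block speed.
def estimate_flight_time(distance_mi, avg_block_mph=515):
    total_min = distance_mi / avg_block_mph * 60
    return int(total_min // 60), int(round(total_min % 60))

h, m = estimate_flight_time(10061)
print(f"{h} h {m} min")  # about 19 h 32 min with the assumed 515 mph average
```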

Flight carbon footprint between Bradley International Airport (BDL) and Mount Isa Airport (ISA)

On average, flying from Windsor Locks to Mount Isa generates about 1 311 kg of CO2 per passenger; 1 311 kilograms is equal to 2 890 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
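As a quick check of the unit conversion, and to show the emission intensity this figure implies, here is a small Python sketch; the 1 311 kg value itself is taken from the estimate above, not recomputed.

```python
# kg -> lb conversion and implied per-mile intensity for the 1 311 kg estimate.
KG_PER_LB = 0.45359237

co2_kg = 1311
co2_lb = co2_kg / KG_PER_LB
print(f"{co2_kg} kg ≈ {co2_lb:,.0f} lb")          # 1311 kg ≈ 2,890 lb

print(f"{co2_kg / 10061:.3f} kg CO2 per passenger-mile")  # about 0.130 (illustrative only)
```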

Map of flight path from Windsor Locks to Mount Isa

See the map of the shortest flight path between Bradley International Airport (BDL) and Mount Isa Airport (ISA).

Airport information

Origin: Bradley International Airport
City: Windsor Locks, CT
Country: United States
IATA Code: BDL
ICAO Code: KBDL
Coordinates: 41°56′20″N, 72°40′59″W

Destination: Mount Isa Airport
City: Mount Isa
Country: Australia
IATA Code: ISA
ICAO Code: YBMA
Coordinates: 20°39′50″S, 139°29′20″E