
How far is Windsor from Albany?

The distance between Albany (Albany Airport (Western Australia)) and Windsor (Windsor International Airport) is 11212 miles / 18044 kilometers / 9743 nautical miles.

Albany Airport (Western Australia) – Windsor International Airport

Distance: 11212 miles / 18044 kilometers / 9743 nautical miles
Flight time: 21 h 43 min
CO2 emission: 1 496 kg


Distance from Albany to Windsor

There are several ways to calculate the distance from Albany to Windsor. Here are two standard methods:

Vincenty's formula (applied above)
  • 11212.126 miles
  • 18044.167 kilometers
  • 9743.071 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
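As a rough sketch of this ellipsoidal calculation, the snippet below uses pyproj's Geod class on the WGS-84 ellipsoid. Note that pyproj implements Karney's geodesic algorithm rather than Vincenty's iteration, but the two agree to well under a metre at this scale; the decimal coordinates are converted from the DMS values given in the airport information below.

```python
from pyproj import Geod

# Coordinates in decimal degrees, converted from the DMS values in the
# airport information section below.
ALH_LAT, ALH_LON = -34.943056, 117.808889   # Albany Airport (Western Australia)
YQG_LAT, YQG_LON = 42.275556, -82.955556    # Windsor International Airport

geod = Geod(ellps="WGS84")                  # ellipsoidal Earth model
_, _, meters = geod.inv(ALH_LON, ALH_LAT, YQG_LON, YQG_LAT)

print(f"{meters / 1609.344:.0f} mi")        # ≈ 11212 miles
print(f"{meters / 1000:.0f} km")            # ≈ 18044 kilometers
print(f"{meters / 1852:.0f} NM")            # ≈ 9743 nautical miles
```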

Haversine formula
  • 11210.797 miles
  • 18042.029 kilometers
  • 9741.916 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
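The haversine formula is compact enough to write out directly. Here is a minimal pure-Python sketch; the 3958.8-mile mean Earth radius is an assumption, since the calculator does not state the radius it uses, so the last digit of the result may differ slightly from the figure above.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a sphere of the given radius (mean Earth radius by default)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_miles * asin(sqrt(a))

# Albany (ALH) to Windsor (YQG)
print(haversine_miles(-34.943056, 117.808889, 42.275556, -82.955556))  # ≈ 11211 miles
```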

How long does it take to fly from Albany to Windsor?

The estimated flight time from Albany Airport (Western Australia) to Windsor International Airport is 21 hours and 43 minutes.
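The calculator does not publish the assumptions behind this estimate, but figures like this are typically derived from the distance, an average block speed, and a fixed allowance for taxi, climb, and descent. The sketch below only illustrates the idea; the 500 mph cruise speed and 30-minute overhead are assumptions, not the site's actual parameters, so it does not reproduce the 21 h 43 min shown above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Very rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance."""
    total_minutes = round((distance_miles / cruise_mph + overhead_hours) * 60)
    hours, minutes = divmod(total_minutes, 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(11212))  # "22 h 55 min" with these assumed parameters
```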

Flight carbon footprint between Albany Airport (Western Australia) (ALH) and Windsor International Airport (YQG)

On average, flying from Albany to Windsor generates about 1 496 kg of CO2 per passenger; 1 496 kilograms is equal to 3 297 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
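The pound figure is simply a unit conversion of the kilogram figure (1 kg ≈ 2.20462 lb); the one-pound mismatch against an exact conversion of 1 496 kg suggests the page rounds from an unrounded underlying value.

```python
co2_kg = 1496
print(round(co2_kg * 2.20462))  # 3298 lbs; the page shows 3 297, presumably from the unrounded kg value
```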

Map of flight path from Albany to Windsor

See the map of the shortest flight path between Albany Airport (Western Australia) (ALH) and Windsor International Airport (YQG).

Airport information

Origin: Albany Airport (Western Australia)
City: Albany
Country: Australia
IATA Code: ALH
ICAO Code: YABA
Coordinates: 34°56′35″S, 117°48′32″E
Destination: Windsor International Airport
City: Windsor
Country: Canada
IATA Code: YQG
ICAO Code: CYQG
Coordinates: 42°16′32″N, 82°57′20″W
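The coordinates above are listed in degrees, minutes, and seconds; to feed them into either distance formula they first need converting to decimal degrees. A minimal sketch follows (the dms_to_decimal helper is illustrative, not part of the calculator).

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate such as 34°56′35″S into signed decimal degrees."""
    deg, minutes, seconds, hemisphere = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemisphere in "SW" else value

alh = (dms_to_decimal("34°56′35″S"), dms_to_decimal("117°48′32″E"))   # (-34.9431, 117.8089)
yqg = (dms_to_decimal("42°16′32″N"), dms_to_decimal("82°57′20″W"))    # (42.2756, -82.9556)
print(alh, yqg)  # these are the decimal values used in the distance sketches above
```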