How far is Hamilton Island from Land's End?

The distance between Land's End (Land's End Airport) and Hamilton Island (Great Barrier Reef Airport) is 9957 miles / 16024 kilometers / 8652 nautical miles.

Land's End Airport – Great Barrier Reef Airport

Distance: 9957 miles / 16024 kilometers / 8652 nautical miles
Flight time: 19 h 21 min
CO2 emission: 1 295 kg

Distance from Land's End to Hamilton Island

There are several ways to calculate the distance from Land's End to Hamilton Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 9956.622 miles
  • 16023.630 kilometers
  • 8652.068 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
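
For reference, this ellipsoidal distance can be reproduced in Python with the geopy library (an illustrative choice; the site's own implementation isn't published). geopy's geodesic solves the same inverse problem on the WGS-84 ellipsoid with Karney's method, a modern refinement of Vincenty's, so it should closely match the figures above:

    from geopy.distance import geodesic

    # Airport coordinates in decimal degrees, converted from the DMS
    # values in the airport information section below
    leq = (50.1028, -5.6706)     # Land's End Airport (LEQ)
    hti = (-20.3581, 148.9519)   # Great Barrier Reef Airport (HTI)

    d = geodesic(leq, hti)       # WGS-84 ellipsoidal distance
    print(f"{d.miles:.3f} mi / {d.kilometers:.3f} km / {d.nautical:.3f} NM")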

Haversine formula
  • 9957.547 miles
  • 16025.118 kilometers
  • 8652.872 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
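
The haversine calculation is compact enough to show in full. Here is a minimal Python sketch, assuming a mean Earth radius of 6371 km (the choice of radius is one reason the spherical results differ slightly from the ellipsoidal ones):

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two lat/lon points on a sphere."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    km = haversine_km(50.1028, -5.6706, -20.3581, 148.9519)
    print(f"{km / 1.609344:.1f} mi / {km:.1f} km")  # roughly 9957 mi / 16025 km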

How long does it take to fly from Land's End to Hamilton Island?

The estimated flight time from Land's End Airport to Great Barrier Reef Airport is 19 hours and 21 minutes.
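
As a quick sanity check (the site's exact model for flight time isn't stated), dividing the distance by the quoted block time gives the implied average speed:

    # Implied average speed for the quoted flight time; an arithmetic
    # check, not the site's published model
    distance_miles = 9957
    block_hours = 19 + 21 / 60           # 19 h 21 min = 19.35 h
    print(distance_miles / block_hours)  # ≈ 514.6 mph, a typical jet cruise speed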

Flight carbon footprint between Land's End Airport (LEQ) and Great Barrier Reef Airport (HTI)

On average, flying from Land's End to Hamilton Island generates about 1 295 kg (2 854 lbs) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
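
The kilogram-to-pound conversion is a single multiplication by the standard factor of about 2.20462 lb per kg:

    co2_kg = 1295
    print(co2_kg * 2.20462)  # ≈ 2854.98 lb, quoted above as 2 854 lbs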

Map of flight path from Land's End to Hamilton Island

See the map of the shortest flight path between Land's End Airport (LEQ) and Great Barrier Reef Airport (HTI).

Airport information

Origin: Land's End Airport
City: Land's End
Country: United Kingdom
IATA Code: LEQ
ICAO Code: EGHC
Coordinates: 50°6′10″N, 5°40′14″W
Destination: Great Barrier Reef Airport
City: Hamilton Island
Country: Australia
IATA Code: HTI
ICAO Code: YBHM
Coordinates: 20°21′29″S, 148°57′7″E
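
The coordinates above are given in degrees/minutes/seconds (DMS), while the distance formulas need decimal degrees. A small conversion sketch (the helper name is illustrative):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert a DMS coordinate to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(50, 6, 10, "N"))   # ≈ 50.1028   (LEQ latitude)
    print(dms_to_decimal(5, 40, 14, "W"))   # ≈ -5.6706   (LEQ longitude)
    print(dms_to_decimal(20, 21, 29, "S"))  # ≈ -20.3581  (HTI latitude)
    print(dms_to_decimal(148, 57, 7, "E"))  # ≈ 148.9519  (HTI longitude)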