How far is Bamaga from Hamilton Island?
The distance between Hamilton Island (Great Barrier Reef Airport) and Bamaga (Northern Peninsula Airport) is 778 miles / 1252 kilometers / 676 nautical miles.
The driving distance from Hamilton Island (HTI) to Bamaga (ABM) is 1011 miles / 1627 kilometers, and travel time by car is about 31 hours 6 minutes.
Great Barrier Reef Airport – Northern Peninsula Airport
Distance from Hamilton Island to Bamaga
There are several ways to calculate the distance from Hamilton Island to Bamaga. Here are two standard methods:
Vincenty's formula (applied above)
- 777.808 miles
- 1251.761 kilometers
- 675.897 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
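For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name is illustrative, and the decimal coordinates are converted from the DMS values in the airport information tables below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in metres between two lat/lon points,
    Vincenty inverse method on the WGS-84 ellipsoid."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        if cos2_alpha != 0:
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        else:
            cos_2sm = 0.0       # both points on the equator
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
              (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# HTI -> ABM, coordinates from the airport tables: ~1,251,761 m (777.8 mi)
print(vincenty_distance(-20.3581, 148.9519, -10.9506, 142.4589))
```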
Haversine formula
- 780.098 miles
- 1255.446 kilometers
- 677.887 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface of a sphere.
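For comparison, a minimal haversine sketch in Python. The 6371 km mean Earth radius is the usual convention; the exact result shifts slightly with the radius chosen.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# HTI -> ABM: ~1255.4 km, matching the haversine figure above
print(haversine_distance(-20.3581, 148.9519, -10.9506, 142.4589))
```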
How long does it take to fly from Hamilton Island to Bamaga?
The estimated flight time from Great Barrier Reef Airport to Northern Peninsula Airport is 1 hour and 58 minutes.
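The page does not state the exact model behind this figure, but estimates like this typically add a fixed taxi/climb allowance to time spent at cruise speed. A hedged sketch, with both parameters as assumptions:

```python
def flight_time_minutes(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rule-of-thumb flight time: cruise time plus a fixed allowance for
    taxi, take-off, climb, and descent. Both parameters are assumptions;
    the article does not disclose the model behind its 1 h 58 min figure."""
    return distance_miles / cruise_mph * 60.0 + overhead_min

# ~123 min with these assumed parameters, in the same range as the
# quoted 1 h 58 min but not an exact reproduction of it
print(flight_time_minutes(778))
```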
What is the time difference between Hamilton Island and Bamaga?
There is no time difference between Hamilton Island and Bamaga.
Flight carbon footprint between Great Barrier Reef Airport (HTI) and Northern Peninsula Airport (ABM)
On average, flying from Hamilton Island to Bamaga generates about 133 kg of CO2 per passenger, which is equivalent to 293 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
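A simple linear per-passenger model reproduces this figure. Note that the emission factor below is back-solved from 133 kg over 778 miles (about 0.171 kg CO2 per passenger-mile) and is an assumption, since real calculators vary the factor by aircraft type and load factor.

```python
KG_PER_LB = 0.45359237

def co2_per_passenger_kg(distance_miles, kg_co2_per_mile=0.171):
    """Linear CO2 estimate. The factor is back-solved from the article's
    133 kg over 778 miles; it is an assumption, not the site's model."""
    return distance_miles * kg_co2_per_mile

kg = co2_per_passenger_kg(778)
print(round(kg), round(kg / KG_PER_LB))  # ~133 kg, ~293 lbs
```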
Map of flight path and driving directions from Hamilton Island to Bamaga
See the map of the shortest flight path between Great Barrier Reef Airport (HTI) and Northern Peninsula Airport (ABM).
Airport information
| Origin | Great Barrier Reef Airport |
|---|---|
| City: | Hamilton Island |
| Country: | Australia |
| IATA Code: | HTI |
| ICAO Code: | YBHM |
| Coordinates: | 20°21′29″S, 148°57′7″E |
| Destination | Northern Peninsula Airport |
|---|---|
| City: | Bamaga |
| Country: | Australia |
| IATA Code: | ABM |
| ICAO Code: | YBAM |
| Coordinates: | 10°57′2″S, 142°27′32″E |
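The coordinates above are given in degrees, minutes, and seconds. A small sketch for converting them to the signed decimal degrees used by the distance functions earlier; the regex assumes the exact °/′/″ format shown in these tables.

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like "20°21′29″S" to signed decimal degrees.
    S and W hemispheres are returned as negative values."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

# Hamilton Island latitude: ~-20.3581
print(dms_to_decimal("20°21′29″S"))
```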