How far is Bamaga from Broken Hill?
The distance between Broken Hill (Broken Hill Airport) and Bamaga (Northern Peninsula Airport) is 1450 miles / 2333 kilometers / 1260 nautical miles.
The driving distance from Broken Hill (BHQ) to Bamaga (ABM) is 2133 miles / 3432 kilometers, and travel time by car is about 51 hours 3 minutes.
Broken Hill Airport – Northern Peninsula Airport
Distance from Broken Hill to Bamaga
There are several ways to calculate the distance from Broken Hill to Bamaga. Here are two standard methods:
Vincenty's formula (applied above)
- 1449.777 miles
- 2333.190 kilometers
- 1259.822 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
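As an illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, tolerance, and iteration cap are my own choices, not the site's code:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # ellipsoidal distance in metres

# BHQ and ABM coordinates (from the airport tables below) in decimal degrees
metres = vincenty_inverse(-32.0014, 141.4719, -10.9506, 142.4589)
print(f"{metres / 1609.344:.1f} mi")  # ≈ 1449.8 mi, matching the figure above
```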
Haversine formula
- 1455.819 miles
- 2342.913 kilometers
- 1265.072 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface).
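A minimal Python sketch of the haversine formula, assuming the commonly used mean Earth radius of 6371 km (the site's exact radius isn't stated):

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine(-32.0014, 141.4719, -10.9506, 142.4589)
print(f"{km:.1f} km = {km / 1.609344:.1f} mi = {km / 1.852:.1f} NM")
# ≈ 2342.9 km = 1455.8 mi = 1265.1 NM, matching the figures above
```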
How long does it take to fly from Broken Hill to Bamaga?
The estimated flight time from Broken Hill Airport to Northern Peninsula Airport is 3 hours and 14 minutes.
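The page doesn't state how the estimate is derived; a common rule of thumb is cruise time at an average ground speed plus a fixed allowance for taxi, climb, and descent. Here is a sketch with assumed parameters (500 mph and 30 minutes), which lands close to, but not exactly on, the figure above:

```python
def estimated_flight_time(distance_mi, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise at an assumed average speed plus
    a fixed allowance for taxi, climb and descent (both values assumed)."""
    total_min = round(distance_mi / cruise_mph * 60) + overhead_min
    return divmod(total_min, 60)

h, m = estimated_flight_time(1450)
print(f"about {h} h {m} min")  # 3 h 24 min with these assumed parameters
```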
What is the time difference between Broken Hill and Bamaga?
Broken Hill observes Australian Central Standard Time (ACST, UTC+9:30) and follows New South Wales daylight saving, while Bamaga, in Queensland, observes Australian Eastern Standard Time (AEST, UTC+10) year-round. Bamaga is therefore 30 minutes ahead of Broken Hill during standard time, and 30 minutes behind during daylight saving.
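You can check this programmatically with Python's standard zoneinfo module (a minimal sketch; Broken Hill has its own IANA zone because it keeps Central time inside New South Wales):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

broken_hill = ZoneInfo("Australia/Broken_Hill")
bamaga = ZoneInfo("Australia/Brisbane")  # Bamaga follows Queensland time

now = datetime.now(timezone.utc)
offset = now.astimezone(bamaga).utcoffset() - now.astimezone(broken_hill).utcoffset()
print(offset)  # 0:30:00 outside NSW daylight saving; negative during it
```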
Flight carbon footprint between Broken Hill Airport (BHQ) and Northern Peninsula Airport (ABM)
On average, flying from Broken Hill to Bamaga generates about 176 kg (388 lb) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
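The page does not state its estimation method. A minimal sketch, assuming a simple linear model whose per-mile factor is backed out of the figures above (176 kg over roughly 1450 miles), might look like this:

```python
def co2_per_passenger_kg(distance_mi, kg_per_mile=0.121):
    """Hypothetical linear model: the page's 176 kg over 1450 mi implies
    roughly 0.121 kg of CO2 per passenger-mile (jet-fuel burn only)."""
    return distance_mi * kg_per_mile

kg = co2_per_passenger_kg(1450)
print(f"{kg:.0f} kg CO2 = {kg * 2.20462:.0f} lb")  # ~175 kg ≈ 387 lb
```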
Map of flight path and driving directions from Broken Hill to Bamaga
See the map of the shortest flight path between Broken Hill Airport (BHQ) and Northern Peninsula Airport (ABM).
Airport information
| Origin | Broken Hill Airport |
| --- | --- |
| City | Broken Hill |
| Country | Australia |
| IATA Code | BHQ |
| ICAO Code | YBHI |
| Coordinates | 32°0′5″S, 141°28′19″E |
| Destination | Northern Peninsula Airport |
| --- | --- |
| City | Bamaga |
| Country | Australia |
| IATA Code | ABM |
| ICAO Code | YBAM |
| Coordinates | 10°57′2″S, 142°27′32″E |
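The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier use decimal degrees. A small sketch of the conversion (the regex and function name are illustrative):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like "32°0′5″S" to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(f"{dms_to_decimal('32°0′5″S'):.4f}, {dms_to_decimal('141°28′19″E'):.4f}")
# -32.0014, 141.4719 — the decimal form of BHQ's coordinates above
```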