How far is Bamaga from Kalgoorlie?
The distance between Kalgoorlie (Kalgoorlie-Boulder Airport) and Bamaga (Northern Peninsula Airport) is 1918 miles / 3086 kilometers / 1666 nautical miles.
The driving distance from Kalgoorlie (KGI) to Bamaga (ABM) is 3318 miles / 5339 kilometers, and travel time by car is about 75 hours 56 minutes.
Kalgoorlie-Boulder Airport – Northern Peninsula Airport
Distance from Kalgoorlie to Bamaga
There are several ways to calculate the distance from Kalgoorlie to Bamaga. Here are two standard methods:
Vincenty's formula (applied above)
- 1917.548 miles
- 3085.994 kilometers
- 1666.303 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
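For reference, here is a minimal Python sketch of the Vincenty inverse method on the WGS-84 ellipsoid, using the airport coordinates from the tables below. This is an illustrative implementation, not the calculator used on this page, so its output may differ slightly from the figure above depending on coordinate rounding and convergence tolerance:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # may fail to converge for near-antipodal points
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1.0 + u2 / 16384.0 * (4096.0 + u2 * (-768.0 + u2 * (320.0 - 175.0 * u2)))
    B = u2 / 1024.0 * (256.0 + u2 * (-128.0 + u2 * (74.0 - 47.0 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)
        - B / 6.0 * cos_2sm * (-3.0 + 4.0 * sin_sigma ** 2) * (-3.0 + 4.0 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# KGI and ABM coordinates in decimal degrees (from the airport tables below)
metres = vincenty_distance(-30.789167, 121.461944, -10.950556, 142.458889)
print(f"{metres / 1609.344:.3f} mi, {metres / 1000:.3f} km, {metres / 1852:.3f} nm")
```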
Haversine formula
- 1920.219 miles
- 3090.292 kilometers
- 1668.624 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
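The haversine calculation is short enough to sketch in full. The 6,371 km mean Earth radius below is a common convention; the figure above may differ by a fraction of a kilometre if the page uses a slightly different radius or coordinate precision:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere with the given mean Earth radius.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_distance(-30.789167, 121.461944, -10.950556, 142.458889)
print(f"{km:.3f} km ({km / 1.609344:.3f} mi, {km / 1.852:.3f} nm)")
```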
How long does it take to fly from Kalgoorlie to Bamaga?
The estimated flight time from Kalgoorlie-Boulder Airport to Northern Peninsula Airport is 4 hours and 7 minutes.
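The page does not state how this estimate is derived. One common rule of thumb divides the great-circle distance by a typical cruise speed and adds a fixed allowance for taxi, climb, and descent; the 460-knot speed and 30-minute allowance below are illustrative assumptions, not the site's published model, though they happen to reproduce the figure above:

```python
def estimate_flight_time(distance_nm, cruise_kn=460, overhead_min=30):
    # Rule-of-thumb estimate: cruise time plus a fixed taxi/climb/descent allowance.
    # cruise_kn and overhead_min are assumed values for illustration.
    minutes = distance_nm / cruise_kn * 60 + overhead_min
    return divmod(round(minutes), 60)

hours, mins = estimate_flight_time(1666)
print(f"about {hours} h {mins} min")  # ~4 h 7 min with these assumptions
```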
What is the time difference between Kalgoorlie and Bamaga?
Bamaga is 2 hours ahead of Kalgoorlie: Kalgoorlie observes Australian Western Standard Time (UTC+8), while Bamaga, in Queensland, observes Australian Eastern Standard Time (UTC+10). Neither location uses daylight saving time.
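The offset can be checked with Python's zoneinfo module (Python 3.9+), using the IANA zones for each town's state, Australia/Perth and Australia/Brisbane:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Kalgoorlie is in Western Australia (Australia/Perth, UTC+8);
# Bamaga is in Queensland (Australia/Brisbane, UTC+10). No DST in either zone.
now = datetime.now(ZoneInfo("UTC"))
perth = now.astimezone(ZoneInfo("Australia/Perth"))
brisbane = now.astimezone(ZoneInfo("Australia/Brisbane"))
diff = brisbane.utcoffset() - perth.utcoffset()
print(f"Bamaga is {diff.total_seconds() / 3600:.0f} hours ahead of Kalgoorlie")
```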
Flight carbon footprint between Kalgoorlie-Boulder Airport (KGI) and Northern Peninsula Airport (ABM)
On average, flying from Kalgoorlie to Bamaga generates about 210 kg (463 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
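The page does not publish its estimation method. A common approach multiplies estimated fuel burn per passenger by the standard combustion factor of roughly 3.16 kg of CO2 per kg of jet fuel; the ~66.5 kg fuel-per-passenger figure below is an assumption chosen to illustrate how a 210 kg total could arise:

```python
CO2_PER_KG_FUEL = 3.16   # widely used jet-fuel combustion factor (kg CO2 per kg fuel)
KG_PER_LB = 0.45359237

def co2_from_fuel(fuel_per_pax_kg):
    # CO2 from combustion only, matching the page's stated scope.
    return fuel_per_pax_kg * CO2_PER_KG_FUEL

co2_kg = co2_from_fuel(66.5)   # assumed fuel burn per passenger, for illustration
print(f"{co2_kg:.0f} kg CO2 = {co2_kg / KG_PER_LB:.0f} lb")  # ~210 kg ~ 463 lb
```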
Map of flight path and driving directions from Kalgoorlie to Bamaga
See the map of the shortest flight path between Kalgoorlie-Boulder Airport (KGI) and Northern Peninsula Airport (ABM).
Airport information
| Origin | Kalgoorlie-Boulder Airport |
| --- | --- |
| City: | Kalgoorlie |
| Country: | Australia |
| IATA Code: | KGI |
| ICAO Code: | YPKG |
| Coordinates: | 30°47′21″S, 121°27′43″E |
| Destination | Northern Peninsula Airport |
| --- | --- |
| City: | Bamaga |
| Country: | Australia |
| IATA Code: | ABM |
| ICAO Code: | YBAM |
| Coordinates: | 10°57′2″S, 142°27′32″E |
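The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page need decimal degrees. A small illustrative converter:

```python
import re

def dms_to_decimal(dms: str) -> float:
    # Parse e.g. "30°47′21″S" into signed decimal degrees (S and W are negative).
    deg, minutes, seconds, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("30°47′21″S"), dms_to_decimal("121°27′43″E"))
# -30.789166...  121.461944...
```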