How far is Bamaga from Argyle?
The distance between Argyle (Argyle Airport) and Bamaga (Northern Peninsula Airport) is 1019 miles / 1639 kilometers / 885 nautical miles.
The driving distance from Argyle (GYL) to Bamaga (ABM) is 2184 miles / 3515 kilometers, and travel time by car is about 51 hours 2 minutes.
Argyle Airport – Northern Peninsula Airport
Distance from Argyle to Bamaga
There are several ways to calculate the distance from Argyle to Bamaga. Here are two standard methods:
Vincenty's formula (applied above)
- 1018.584 miles
- 1639.253 kilometers
- 885.126 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 1018.204 miles
- 1638.641 kilometers
- 884.795 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of the sphere).
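The haversine figure above can be reproduced with a few lines of standard-library Python, using the airport coordinates from the tables below. The mean earth radius of 6371 km is an assumption, which is also why the result differs slightly from the ellipsoidal Vincenty value (a Vincenty-style calculation needs an iterative ellipsoid solution, available for example via `geopy.distance.geodesic`):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# Argyle (GYL): 16°38′12″S, 128°27′3″E; Bamaga (ABM): 10°57′2″S, 142°27′32″E
gyl = (-(16 + 38 / 60 + 12 / 3600), 128 + 27 / 60 + 3 / 3600)
abm = (-(10 + 57 / 60 + 2 / 3600), 142 + 27 / 60 + 32 / 3600)

km = haversine_km(*gyl, *abm)
print(f"{km:.1f} km, {km / 1.609344:.1f} mi, {km / 1.852:.1f} nm")
```

The output lands within a kilometer or two of the 1638.641 km quoted above; the small gap from the Vincenty figure reflects the spherical-earth simplification.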
How long does it take to fly from Argyle to Bamaga?
The estimated flight time from Argyle Airport to Northern Peninsula Airport is 2 hours and 25 minutes.
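Published flight times like this are typically block-time estimates rather than pure cruise time. A minimal sketch of the calculation, assuming a hypothetical average block speed and taxi buffer (both parameters are illustrative assumptions, not the values used for the figure above):

```python
def estimate_flight_time(distance_miles, avg_speed_mph=460, taxi_minutes=15):
    """Rough block-time estimate: cruise leg plus a fixed taxi/climb buffer.
    Both default parameters are illustrative assumptions."""
    total_min = taxi_minutes + distance_miles / avg_speed_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return hours, minutes

print(estimate_flight_time(1019))  # rough estimate for GYL-ABM
```

With these assumed parameters the estimate comes out within a few minutes of the 2 hours 25 minutes quoted above.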
What is the time difference between Argyle and Bamaga?
The time difference between Argyle and Bamaga is 2 hours. Bamaga is 2 hours ahead of Argyle.
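The offset can be checked with Python's standard `zoneinfo` module: Argyle Airport is in Western Australia (AWST, `Australia/Perth`) and Bamaga is in Queensland (AEST, `Australia/Brisbane`), and neither zone observes daylight saving, so the 2-hour difference holds year-round:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Same wall-clock instant interpreted in each airport's time zone.
when = datetime(2024, 6, 1, 12, 0)
argyle = when.replace(tzinfo=ZoneInfo("Australia/Perth"))
bamaga = when.replace(tzinfo=ZoneInfo("Australia/Brisbane"))

offset_hours = (bamaga.utcoffset() - argyle.utcoffset()).total_seconds() / 3600
print(offset_hours)  # 2.0 -> Bamaga is 2 hours ahead of Argyle
```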
Flight carbon footprint between Argyle Airport (GYL) and Northern Peninsula Airport (ABM)
On average, flying from Argyle to Bamaga generates about 152 kg of CO2 per passenger, which is about 335 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
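The pounds figure is a straight unit conversion, and dividing by the route distance also gives a per-kilometer emissions intensity. The 152 kg input is the estimate quoted above; the kg-to-lb factor is the standard one:

```python
CO2_KG = 152          # estimated CO2 per passenger for this route
ROUTE_KM = 1639       # great-circle distance from the section above
LB_PER_KG = 2.20462   # standard conversion factor

pounds = CO2_KG * LB_PER_KG
grams_per_km = CO2_KG / ROUTE_KM * 1000  # g of CO2 per passenger-km
print(f"{pounds:.0f} lb total, {grams_per_km:.0f} g CO2 per passenger-km")
```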
Airport information
| Origin | Argyle Airport |
| --- | --- |
| City | Argyle |
| Country | Australia |
| IATA Code | GYL |
| ICAO Code | YARG |
| Coordinates | 16°38′12″S, 128°27′3″E |
| Destination | Northern Peninsula Airport |
| --- | --- |
| City | Bamaga |
| Country | Australia |
| IATA Code | ABM |
| ICAO Code | YBAM |
| Coordinates | 10°57′2″S, 142°27′32″E |