How far is Bamaga from Meekatharra?
The distance between Meekatharra (Meekatharra Airport) and Bamaga (Northern Peninsula Airport) is 1895 miles / 3049 kilometers / 1646 nautical miles.
The driving distance from Meekatharra (MKR) to Bamaga (ABM) is 3588 miles / 5774 kilometers, and travel time by car is about 76 hours 52 minutes.
Meekatharra Airport – Northern Peninsula Airport
Distance from Meekatharra to Bamaga
There are several ways to calculate the distance from Meekatharra to Bamaga. Here are two standard methods:
Vincenty's formula (applied above)
- 1894.633 miles
- 3049.116 kilometers
- 1646.391 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
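Below is a minimal sketch of an ellipsoidal calculation in Python using the geopy library. Note that geopy's `geodesic()` uses Karney's algorithm rather than Vincenty's iteration, but it solves the same ellipsoidal problem, so the result should agree closely with the Vincenty figure above. The decimal coordinates are converted from the airport table further down.

```python
# Ellipsoidal (Vincenty-style) distance sketch using geopy's geodesic(),
# which solves the same problem on the WGS-84 ellipsoid.
from geopy.distance import geodesic

# Decimal-degree coordinates from the airport information tables below
# (south latitude is negative, east longitude is positive).
meekatharra = (-26.6117, 118.5478)   # MKR, 26°36′42″S 118°32′52″E
bamaga = (-10.9506, 142.4589)        # ABM, 10°57′2″S 142°27′32″E

d = geodesic(meekatharra, bamaga)
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} NM")
```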
Haversine formula
- 1895.514 miles
- 3050.535 kilometers
- 1647.157 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
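For comparison, here is a minimal self-contained haversine sketch in Python, assuming a mean Earth radius of 6371 km; it reproduces the spherical figures above to within rounding.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(-26.6117, 118.5478, -10.9506, 142.4589)
print(f"{km:.0f} km / {km * 0.621371:.0f} mi / {km * 0.539957:.0f} NM")
```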
How long does it take to fly from Meekatharra to Bamaga?
The estimated flight time from Meekatharra Airport to Northern Peninsula Airport is 4 hours and 5 minutes.
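The exact model behind this estimate is not stated; a common back-of-the-envelope approach (hypothetical assumptions: an average cruise speed of about 500 mph plus a fixed allowance for climb and descent) lands in the same range:

```python
# Rough flight-time estimate; the assumed cruise speed and overhead are
# illustrative only, so the result differs slightly from the figure above.
distance_miles = 1895
cruise_mph = 500          # assumed average speed
overhead_min = 30         # assumed allowance for takeoff, climb and descent

total_min = overhead_min + distance_miles / cruise_mph * 60
print(f"~{int(total_min // 60)} h {int(total_min % 60)} min")  # roughly 4 h 17 min
```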
What is the time difference between Meekatharra and Bamaga?
The time difference between Meekatharra and Bamaga is 2 hours. Bamaga is 2 hours ahead of Meekatharra.
Flight carbon footprint between Meekatharra Airport (MKR) and Northern Peninsula Airport (ABM)
On average, flying from Meekatharra to Bamaga generates about 208 kg of CO2 per passenger (208 kilograms equals 458 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path and driving directions from Meekatharra to Bamaga
See the map of the shortest flight path between Meekatharra Airport (MKR) and Northern Peninsula Airport (ABM).
Airport information
| Origin | Meekatharra Airport |
| --- | --- |
| City: | Meekatharra |
| Country: | Australia |
| IATA Code: | MKR |
| ICAO Code: | YMEK |
| Coordinates: | 26°36′42″S, 118°32′52″E |
| Destination | Northern Peninsula Airport |
| --- | --- |
| City: | Bamaga |
| Country: | Australia |
| IATA Code: | ABM |
| ICAO Code: | YBAM |
| Coordinates: | 10°57′2″S, 142°27′32″E |
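The distance formulas above work in decimal degrees, while the tables list degrees/minutes/seconds. A small sketch of the conversion (the helper name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Coordinates from the airport information tables above
mkr = (dms_to_decimal(26, 36, 42, "S"), dms_to_decimal(118, 32, 52, "E"))
abm = (dms_to_decimal(10, 57, 2, "S"), dms_to_decimal(142, 27, 32, "E"))
print(mkr)  # approximately (-26.6117, 118.5478)
print(abm)  # approximately (-10.9506, 142.4589)
```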