How far is Bamaga from Maryborough?
The distance between Maryborough (Maryborough Airport) and Bamaga (Northern Peninsula Airport) is 1206 miles / 1941 kilometers / 1048 nautical miles.
The driving distance from Maryborough (MBH) to Bamaga (ABM) is 1508 miles / 2427 kilometers, and travel time by car is about 37 hours 13 minutes.
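As a quick sanity check, the driving figures above imply an average speed of roughly 65 km/h. A one-line Python check, using the numbers straight from the estimate (stops are not counted):

```python
# Average speed implied by the driving estimate above (stops not counted)
distance_km = 2427
hours = 37 + 13 / 60
print(f"{distance_km / hours:.0f} km/h")  # ~65 km/h
```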
Maryborough Airport – Northern Peninsula Airport
Distance from Maryborough to Bamaga
There are several ways to calculate the distance from Maryborough to Bamaga. Here are two standard methods:
Vincenty's formula (applied above)
- 1205.866 miles
- 1940.653 kilometers
- 1047.869 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
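For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, iteration cap, and convergence tolerance are illustrative choices, not the calculator's actual code:

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0           # semi-major axis (m)
    f = 1 / 298.257223563   # flattening
    b = (1 - f) * a         # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial line
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)

# Airport coordinates in decimal degrees (from the tables below)
m = vincenty_distance_m(-25.5131, 152.7147, -10.9506, 142.4589)
print(f"{m / 1000:.1f} km")  # ~1940.7 km, matching the figure above
```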
Haversine formula
- 1209.139 miles
- 1945.920 kilometers
- 1050.713 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
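The spherical calculation is much simpler. A minimal Python sketch, assuming the conventional mean Earth radius of 6,371 km (the coordinates come from the airport tables below):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km, treating the earth as a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # a = sin²(Δφ/2) + cos φ1 · cos φ2 · sin²(Δλ/2)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.asin(math.sqrt(h))

km = haversine_km(-25.5131, 152.7147, -10.9506, 142.4589)
print(f"{km:.1f} km / {km * 0.621371:.1f} mi")  # ~1945.9 km / ~1209.1 mi
```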
How long does it take to fly from Maryborough to Bamaga?
The estimated flight time from Maryborough Airport to Northern Peninsula Airport is 2 hours and 46 minutes.
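The calculator's exact parameters are not published, but a simple distance-over-speed model reproduces the figure. The 800 km/h average speed and 20-minute climb/descent allowance below are assumptions chosen purely for illustration:

```python
# Illustrative only: an assumed 800 km/h average speed plus a 20-minute
# climb/descent/taxi allowance happens to reproduce the figure above.
distance_km = 1941
minutes = round(distance_km / 800 * 60 + 20)
print(f"{minutes // 60} h {minutes % 60} min")  # 2 h 46 min
```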
What is the time difference between Maryborough and Bamaga?
There is no time difference between Maryborough and Bamaga. Both are in Queensland, which observes Australian Eastern Standard Time (UTC+10) year-round.
Flight carbon footprint between Maryborough Airport (MBH) and Northern Peninsula Airport (ABM)
On average, flying from Maryborough to Bamaga generates about 162 kg of CO2 per passenger, which is about 357 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
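For reference, the unit conversion, plus a back-of-envelope fuel estimate using the commonly cited ICAO factor of roughly 3.16 kg of CO2 per kg of jet fuel burned (an assumption for illustration, not necessarily this site's methodology):

```python
co2_kg = 162
print(f"{co2_kg * 2.20462:.0f} lb")  # ~357 lb per passenger
# Back-of-envelope fuel estimate: ~3.16 kg CO2 per kg of jet fuel
# (a commonly cited ICAO factor; an assumption, not this site's method)
print(f"{co2_kg / 3.16:.0f} kg of fuel per passenger")  # ~51 kg
```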
Map of flight path and driving directions from Maryborough to Bamaga
See the map of the shortest flight path between Maryborough Airport (MBH) and Northern Peninsula Airport (ABM).
Airport information
Origin | Maryborough Airport
---|---
City: | Maryborough
Country: | Australia
IATA Code: | MBH
ICAO Code: | YMYB
Coordinates: | 25°30′47″S, 152°42′53″E
Destination | Northern Peninsula Airport
---|---
City: | Bamaga
Country: | Australia
IATA Code: | ABM
ICAO Code: | YBAM
Coordinates: | 10°57′2″S, 142°27′32″E
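The DMS coordinates in the tables above convert to the decimal degrees used in the distance sketches earlier. A minimal helper (the function name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds + hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(f"{dms_to_decimal(25, 30, 47, 'S'):.4f}")   # -25.5131 (MBH latitude)
print(f"{dms_to_decimal(142, 27, 32, 'E'):.4f}")  # 142.4589 (ABM longitude)
```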