
How far is Bamaga from Newcastle?

The distance between Newcastle (Newcastle Airport) and Bamaga (Northern Peninsula Airport) is 1617 miles / 2603 kilometers / 1405 nautical miles.

The driving distance from Newcastle (NTL) to Bamaga (ABM) is 2034 miles / 3273 kilometers, and travel time by car is about 48 hours 35 minutes.

Distance from Newcastle to Bamaga

There are several ways to calculate the distance from Newcastle to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 1617.133 miles
  • 2602.523 kilometers
  • 1405.250 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
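
As a concrete illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. This is our own sketch, not the calculator's code; the iteration cap and convergence tolerance are arbitrary choices, and with the airport coordinates listed under "Airport information" below it should land very close to the 2602.5-kilometer figure above.

    import math

    # Minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid.
    WGS84_A = 6378137.0              # semi-major axis, metres
    WGS84_F = 1 / 298.257223563      # flattening

    def vincenty_km(lat1, lon1, lat2, lon2):
        """Ellipsoidal distance in kilometres between two lat/lon points."""
        a, f = WGS84_A, WGS84_F
        b = (1 - f) * a
        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):                  # iterate until lambda converges
            sinLam, cosLam = math.sin(lam), math.cos(lam)
            sinSigma = math.hypot(cosU2 * sinLam,
                                  cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
            if sinSigma == 0:
                return 0.0                    # coincident points
            cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
            sigma = math.atan2(sinSigma, cosSigma)
            sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
            cos2Alpha = 1 - sinAlpha ** 2
            cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                          if cos2Alpha else 0.0)   # equatorial line
            C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
            lamPrev = lam
            lam = L + (1 - C) * f * sinAlpha * (
                sigma + C * sinSigma * (
                    cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
            if abs(lam - lamPrev) < 1e-12:
                break

        u2 = cos2Alpha * (a * a - b * b) / (b * b)
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
            * (-3 + 4 * cos2SigmaM ** 2)))
        return b * A * (sigma - deltaSigma) / 1000.0

    # NTL -> ABM with the coordinates listed under "Airport information"
    print(round(vincenty_km(-32.7947, 151.8339, -10.9506, 142.4589), 1))  # ~2602.5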

Haversine formula
  • 1622.547 miles
  • 2611.236 kilometers
  • 1409.955 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
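
For comparison, the haversine version is much shorter, since it needs no iteration. The 6371 km mean Earth radius below is the conventional choice; a different radius shifts the result slightly.

    import math

    EARTH_RADIUS_KM = 6371.0  # mean Earth radius; a conventional choice

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometres, assuming a spherical Earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(h))

    print(round(haversine_km(-32.7947, 151.8339, -10.9506, 142.4589), 1))  # ~2611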

How long does it take to fly from Newcastle to Bamaga?

The estimated flight time from Newcastle Airport to Northern Peninsula Airport is 3 hours and 33 minutes.
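
The page does not state the model behind this figure. A common rule of thumb for such estimates is cruise time at a typical jet speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses 500 mph and 30 minutes, both assumptions rather than the site's actual model, and comes out about ten minutes above the 3 hours 33 minutes quoted.

    def flight_time_hours(distance_miles, cruise_mph=500, overhead_hours=0.5):
        # Rough block-time estimate: cruise time plus a fixed taxi/climb/descent
        # allowance. Both parameters are assumptions, not the site's model.
        return distance_miles / cruise_mph + overhead_hours

    hours = flight_time_hours(1617)
    print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # 3 h 44 min with these assumptions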

Flight carbon footprint between Newcastle Airport (NTL) and Northern Peninsula Airport (ABM)

On average, flying from Newcastle to Bamaga generates about 187 kg of CO2 per passenger, which is roughly 412 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
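
Both numbers can be checked with a line of arithmetic each; the per-kilometre rate below is derived from the page's own figures (187 kg over 2603 km), not from an independent emissions model.

    LBS_PER_KG = 2.20462

    co2_kg = 187                           # per-passenger estimate quoted above
    distance_km = 2603

    print(round(co2_kg * LBS_PER_KG))      # 412 lbs
    print(round(co2_kg / distance_km, 3))  # ~0.072 kg CO2 per passenger-km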

Map of flight path and driving directions from Newcastle to Bamaga

See the map of the shortest flight path between Newcastle Airport (NTL) and Northern Peninsula Airport (ABM).
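
The mapped path follows the great circle between the two airports. If you want to plot it yourself, one approach is to interpolate intermediate points along the great circle with spherical linear interpolation, as in the sketch below (the 20-segment resolution is arbitrary).

    import math

    def great_circle_points(lat1, lon1, lat2, lon2, n=20):
        """n+1 points along the great circle between two locations, via
        spherical linear interpolation. Assumes the endpoints are neither
        coincident nor antipodal."""
        def to_vec(lat, lon):
            la, lo = math.radians(lat), math.radians(lon)
            return (math.cos(la) * math.cos(lo),
                    math.cos(la) * math.sin(lo),
                    math.sin(la))
        p1, p2 = to_vec(lat1, lon1), to_vec(lat2, lon2)
        delta = math.acos(sum(u * v for u, v in zip(p1, p2)))  # angular distance
        points = []
        for i in range(n + 1):
            t = i / n
            w1 = math.sin((1 - t) * delta) / math.sin(delta)
            w2 = math.sin(t * delta) / math.sin(delta)
            x, y, z = (w1 * u + w2 * v for u, v in zip(p1, p2))
            points.append((math.degrees(math.asin(z)),
                           math.degrees(math.atan2(y, x))))
        return points

    # Waypoints between NTL and ABM, ready to feed to a mapping library
    waypoints = great_circle_points(-32.7947, 151.8339, -10.9506, 142.4589)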

Airport information

Origin: Newcastle Airport
City: Newcastle
Country: Australia
IATA Code: NTL
ICAO Code: YWLM
Coordinates: 32°47′41″S, 151°50′2″E
Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E
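
The coordinates above are given in degrees, minutes, and seconds, while the distance sketches earlier expect decimal degrees. A small converter, following the usual convention that south and west are negative:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # South and west hemispheres get a negative sign.
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # Newcastle Airport (NTL): 32°47′41″S, 151°50′2″E
    ntl = (dms_to_decimal(32, 47, 41, "S"), dms_to_decimal(151, 50, 2, "E"))
    # Northern Peninsula Airport (ABM): 10°57′2″S, 142°27′32″E
    abm = (dms_to_decimal(10, 57, 2, "S"), dms_to_decimal(142, 27, 32, "E"))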