
How far is Bamaga from Tadji?

The distance between Tadji (Tadji Airport) and Bamaga (Northern Peninsula Airport) is 533 miles / 857 kilometers / 463 nautical miles.

Tadji Airport – Northern Peninsula Airport
  • 533 miles
  • 857 kilometers
  • 463 nautical miles


Distance from Tadji to Bamaga

There are several ways to calculate the distance from Tadji to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 532.756 miles
  • 857.388 kilometers
  • 462.952 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
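As an illustration (not the calculator's own code), here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal-degree coordinates are converted from the airport coordinates listed in the airport information below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# TAJ and ABM coordinates from the airport information below, in decimal degrees.
taj = (-3.198056, 142.430833)    # 3°11′53″S, 142°25′51″E
abm = (-10.950556, 142.458889)   # 10°57′2″S, 142°27′32″E
metres = vincenty_distance(*taj, *abm)
print(f"{metres / 1609.344:.3f} mi, {metres / 1000:.3f} km, {metres / 1852:.3f} NM")
# ≈ 532.8 mi / 857.4 km / 463.0 NM
```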

Haversine formula
  • 535.658 miles
  • 862.057 kilometers
  • 465.474 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
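For comparison, a minimal Python sketch of the haversine formula using the same decimal-degree coordinates; the mean earth radius of 6,371 km is an assumption, and the result varies slightly with the radius chosen.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_distance(-3.198056, 142.430833, -10.950556, 142.458889)
print(f"{km / 1.609344:.3f} mi, {km:.3f} km, {km / 1.852:.3f} NM")
# ≈ 535.7 mi / 862.0 km / 465.5 NM
```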

How long does it take to fly from Tadji to Bamaga?

The estimated flight time from Tadji Airport to Northern Peninsula Airport is 1 hour and 30 minutes.
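The calculator's exact method isn't stated; one common rule of thumb (an assumption, not necessarily the estimator used here) is to assume an average cruise speed of about 500 mph and add roughly 30 minutes for taxi, climb and descent, which lands close to the figure above.

```python
distance_miles = 533                      # Vincenty distance from above
minutes = 30 + distance_miles / 500 * 60  # ~30 min overhead + cruise at ~500 mph (assumed)
print(f"about {int(minutes // 60)} h {round(minutes % 60)} min")
# about 1 h 34 min, close to the 1 hour 30 minutes quoted above
```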

What is the time difference between Tadji and Bamaga?

There is no time difference between Tadji and Bamaga.

Flight carbon footprint between Tadji Airport (TAJ) and Northern Peninsula Airport (ABM)

On average, flying from Tadji to Bamaga generates about 103 kilograms of CO2 per passenger (roughly 228 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path from Tadji to Bamaga

See the map of the shortest flight path between Tadji Airport (TAJ) and Northern Peninsula Airport (ABM).

Airport information

Origin: Tadji Airport
City: Tadji
Country: Papua New Guinea
IATA Code: TAJ
ICAO Code: AYTJ
Coordinates: 3°11′53″S, 142°25′51″E
Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E