
How far is Bamaga from Jacquinot Bay?

The distance between Jacquinot Bay (Jacquinot Bay Airport) and Bamaga (Northern Peninsula Airport) is 718 miles / 1156 kilometers / 624 nautical miles.

Jacquinot Bay Airport – Northern Peninsula Airport

718 miles / 1156 kilometers / 624 nautical miles


Distance from Jacquinot Bay to Bamaga

There are several ways to calculate the distance from Jacquinot Bay to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 718.232 miles
  • 1155.882 kilometers
  • 624.126 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
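For readers who want to reproduce the ellipsoidal figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates come from the airport information further down the page; the exact decimals may differ slightly from the values above depending on rounding and convergence tolerance.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in meters between two points on the WGS-84 ellipsoid,
    computed with Vincenty's inverse formula (iterative)."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)   # geodesic length in meters

# JAQ and ABM coordinates in decimal degrees (see airport information below)
print(vincenty_distance(-5.6525, 151.5069, -10.9506, 142.4589) / 1000)  # ≈ 1156 km
```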

Haversine formula
  • 718.596 miles
  • 1156.468 kilometers
  • 624.443 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
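For comparison, here is a short haversine sketch in Python. The 6,371 km mean Earth radius is an assumption; other common radius choices shift the result by a few tenths of a percent.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Jacquinot Bay (JAQ) and Bamaga (ABM) coordinates in decimal degrees
jaq = (-5.6525, 151.5069)   # 5°39′9″S, 151°30′25″E
abm = (-10.9506, 142.4589)  # 10°57′2″S, 142°27′32″E
print(round(haversine(*jaq, *abm)))  # ≈ 1156 km
```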

How long does it take to fly from Jacquinot Bay to Bamaga?

The estimated flight time from Jacquinot Bay Airport to Northern Peninsula Airport is 1 hour and 51 minutes.
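The page does not publish its exact assumptions, but estimates like this typically combine a fixed allowance for taxi, climb and descent with time spent at a typical cruise speed. The sketch below uses an assumed 30-minute overhead and 500 mph cruise for illustration, so it will not match the quoted 1 hour 51 minutes exactly.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: fixed overhead plus time at cruise speed.
    Both parameters are illustrative assumptions, not the site's actual model."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(718))  # ≈ 1 h 56 min with these assumptions
```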

What is the time difference between Jacquinot Bay and Bamaga?

There is no time difference between Jacquinot Bay and Bamaga.

Flight carbon footprint between Jacquinot Bay Airport (JAQ) and Northern Peninsula Airport (ABM)

On average, flying from Jacquinot Bay to Bamaga generates about 127 kg of CO2 per passenger, which is equivalent to 279 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
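A back-of-the-envelope way to reproduce a figure like this is to multiply the route distance by an average per-passenger emission factor. The ~0.11 kg CO2 per passenger-kilometer used below is an assumption for illustration, not the calculator's actual model.

```python
KG_PER_PAX_KM = 0.11   # assumed average emission factor (kg CO2 per passenger-km)
LB_PER_KG = 2.20462    # kilograms to pounds

distance_km = 1156
co2_kg = distance_km * KG_PER_PAX_KM
print(f"{co2_kg:.0f} kg CO2 per passenger ≈ {co2_kg * LB_PER_KG:.0f} lb")
# -> roughly 127 kg ≈ 280 lb with these assumed inputs
```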

Map of flight path from Jacquinot Bay to Bamaga

See the map of the shortest flight path between Jacquinot Bay Airport (JAQ) and Northern Peninsula Airport (ABM).

Airport information

Origin: Jacquinot Bay Airport
City: Jacquinot Bay
Country: Papua New Guinea
IATA Code: JAQ
ICAO Code: AYJB
Coordinates: 5°39′9″S, 151°30′25″E
Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E
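The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier on this page expect decimal degrees. A small conversion helper (the function name and layout are illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Jacquinot Bay Airport: 5°39′9″S, 151°30′25″E
print(dms_to_decimal(5, 39, 9, "S"), dms_to_decimal(151, 30, 25, "E"))
# Northern Peninsula Airport: 10°57′2″S, 142°27′32″E
print(dms_to_decimal(10, 57, 2, "S"), dms_to_decimal(142, 27, 32, "E"))
```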