How far is Mabuiag Island from Parkes?

The distance between Parkes (Parkes Airport) and Mabuiag Island (Mabuiag Island Airport) is 1641 miles / 2641 kilometers / 1426 nautical miles.

The driving distance from Parkes (PKE) to Mabuiag Island (UBB) is 2013 miles / 3240 kilometers, and travel time by car is about 48 hours 17 minutes.

Parkes Airport – Mabuiag Island Airport

1641 miles
2641 kilometers
1426 nautical miles

Distance from Parkes to Mabuiag Island

There are several ways to calculate the distance from Parkes to Mabuiag Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 1641.064 miles
  • 2641.037 kilometers
  • 1426.046 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
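As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates are taken from the airport information section below; the iteration limit and convergence tolerance are assumptions, not this page's published implementation.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two points (Vincenty inverse formula, WGS-84)."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):      # iterate until the auxiliary longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sigma_m ** 2)
        - B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Parkes (PKE) and Mabuiag Island (UBB) coordinates from the airport section, in decimal degrees
metres = vincenty_distance(-33.131389, 148.238889, -9.949722, 142.182778)
print(f"{metres / 1000:.1f} km")  # should land close to the 2641 km quoted above
```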

Haversine formula
  • 1647.360 miles
  • 2651.169 kilometers
  • 1431.517 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, i.e. the shortest path between two points along the surface of the sphere).
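For comparison, a short haversine sketch; the 6371 km mean Earth radius is a common convention and an assumption here, since this page does not state which radius it uses.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Parkes (PKE) to Mabuiag Island (UBB); expect a value near the 2651 km listed above
print(f"{haversine_distance(-33.131389, 148.238889, -9.949722, 142.182778):.0f} km")
```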

How long does it take to fly from Parkes to Mabuiag Island?

The estimated flight time from Parkes Airport to Mabuiag Island Airport is 3 hours and 36 minutes.
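The page does not publish its flight-time method. A common rule of thumb is cruise time at an assumed average speed plus a fixed allowance for taxi, climb and descent; the parameters below are assumptions chosen only to land near the quoted figure.

```python
def estimate_flight_time(distance_miles, cruise_speed_mph=500.0, overhead_hours=0.3):
    """Rough block-time estimate: assumed overhead plus cruise time at an assumed speed."""
    hours = overhead_hours + distance_miles / cruise_speed_mph
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours {m} minutes"

print(estimate_flight_time(1641))  # about 3 hours 35 minutes with these assumed parameters
```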

Flight carbon footprint between Parkes Airport (PKE) and Mabuiag Island Airport (UBB)

On average, flying from Parkes to Mabuiag Island generates about 189 kg of CO2 per passenger; 189 kilograms equals 416 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
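A small sketch of how such an estimate can be computed. The per-kilometre emission factor below is not an official value; it is back-derived from this page's own figures (189 kg over 2641 km ≈ 72 g of CO2 per passenger-km) purely for illustration.

```python
KG_PER_POUND = 0.45359237  # exact definition of the avoirdupois pound

def co2_estimate_kg(distance_km, kg_per_passenger_km=0.0716):
    """CO2 per passenger; the per-km factor is back-derived from this page's figures."""
    return distance_km * kg_per_passenger_km

kg = co2_estimate_kg(2641)
print(f"{kg:.0f} kg  ~  {kg / KG_PER_POUND:.0f} lbs")  # about 189 kg, ~417 lbs (416 lbs after the page's rounding)
```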

Map of flight path and driving directions from Parkes to Mabuiag Island

See the map of the shortest flight path between Parkes Airport (PKE) and Mabuiag Island Airport (UBB).

Airport information

Origin: Parkes Airport
City: Parkes
Country: Australia
IATA Code: PKE
ICAO Code: YPKS
Coordinates: 33°7′53″S, 148°14′20″E
Destination: Mabuiag Island Airport
City: Mabuiag Island
Country: Australia
IATA Code: UBB
ICAO Code: YMAA
Coordinates: 9°56′59″S, 142°10′58″E
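The coordinates above are given in degrees, minutes and seconds. A small helper (a convenience sketch, not part of this page) converts them to the signed decimal degrees used by the distance formulas earlier:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Parkes Airport (PKE): 33°7′53″S, 148°14′20″E
print(dms_to_decimal(33, 7, 53, "S"), dms_to_decimal(148, 14, 20, "E"))
# Mabuiag Island Airport (UBB): 9°56′59″S, 142°10′58″E
print(dms_to_decimal(9, 56, 59, "S"), dms_to_decimal(142, 10, 58, "E"))
```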