How far is Hervey Bay from West Palm Beach, FL?

The distance between West Palm Beach (Palm Beach International Airport) and Hervey Bay (Hervey Bay Airport) is 9173 miles / 14763 kilometers / 7971 nautical miles.

Palm Beach International Airport – Hervey Bay Airport

Distance: 9173 miles / 14763 kilometers / 7971 nautical miles
Flight time: 17 h 52 min
CO2 emission: 1,174 kg

Distance from West Palm Beach to Hervey Bay

There are several ways to calculate the distance from West Palm Beach to Hervey Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 9173.336 miles
  • 14763.053 kilometers
  • 7971.411 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
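
To illustrate how such an ellipsoidal calculation works, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. This is not the calculator's own code; the decimal-degree coordinates in the usage line are converted from the values listed under "Airport information" below, and the iteration limit and convergence tolerance are arbitrary choices for the example.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty, inverse problem)."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # cos^2(alpha) = 0: equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                                     + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# PBI -> HVB, using the coordinates listed under "Airport information" (decimal degrees)
metres = vincenty_inverse(26.683, -80.096, -25.319, 152.880)
print(metres / 1609.344, "miles")   # roughly 9173 miles, in line with the figure above
```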

Haversine formula
  • 9169.923 miles
  • 14757.561 kilometers
  • 7968.445 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
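
A spherical great-circle calculation is much shorter. The sketch below is a standard haversine implementation, assuming a mean Earth radius of about 3,958.8 miles (6,371 km) and reusing the same decimal-degree coordinates; it is an illustration rather than the calculator's exact code.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance in miles, assuming a spherical Earth (mean radius ~6371 km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_miles * math.asin(math.sqrt(a))

# PBI -> HVB with the same decimal-degree coordinates as above
print(haversine_miles(26.683, -80.096, -25.319, 152.880))   # roughly 9170 miles
```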

How long does it take to fly from West Palm Beach to Hervey Bay?

The estimated flight time from Palm Beach International Airport to Hervey Bay Airport is 17 hours and 52 minutes.
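
The calculator does not state the formula behind this estimate. As a rough, purely illustrative check, dividing the quoted distance by an assumed average block speed of around 513 mph gives a similar figure:

```python
# Illustration only: the average block speed below is an assumption,
# not the calculator's published method.
distance_miles = 9173
avg_block_speed_mph = 513          # assumed average over cruise, climb and descent
hours = distance_miles / avg_block_speed_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")   # about 17 h 53 min
```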

Flight carbon footprint between Palm Beach International Airport (PBI) and Hervey Bay Airport (HVB)

On average, flying from West Palm Beach to Hervey Bay generates about 1,174 kg of CO2 per passenger, equivalent to roughly 2,587 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path from West Palm Beach to Hervey Bay

See the map of the shortest flight path between Palm Beach International Airport (PBI) and Hervey Bay Airport (HVB).

Airport information

Origin: Palm Beach International Airport
City: West Palm Beach, FL
Country: United States
IATA Code: PBI
ICAO Code: KPBI
Coordinates: 26°40′59″N, 80°5′44″W
Destination: Hervey Bay Airport
City: Hervey Bay
Country: Australia
IATA Code: HVB
ICAO Code: YHBA
Coordinates: 25°19′8″S, 152°52′48″E