How far is Lismore from West Palm Beach, FL?

The distance between West Palm Beach (Palm Beach International Airport) and Lismore (Lismore Airport) is 9201 miles / 14808 kilometers / 7996 nautical miles.

Palm Beach International Airport – Lismore Airport

Distance: 9201 miles / 14808 kilometers / 7996 nautical miles
Flight time: 17 h 55 min
CO2 emission: 1178 kg

Distance from West Palm Beach to Lismore

There are several ways to calculate the distance from West Palm Beach to Lismore. Here are two standard methods:

Vincenty's formula (applied above)
  • 9201.398 miles
  • 14808.215 kilometers
  • 7995.797 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
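
If you want to reproduce the ellipsoidal figure yourself, here is a minimal pure-Python sketch of Vincenty's inverse method. It assumes the WGS-84 ellipsoid; the site does not state which ellipsoid or constants it uses.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns statute miles."""
    a, f = 6378137.0, 1 / 298.257223563  # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero on equatorial lines where cos2_alpha == 0
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - d_sigma)
    return meters / 1609.344  # meters -> statute miles

# PBI -> LSY in decimal degrees (converted from the airport table below)
print(round(vincenty_miles(26.6831, -80.0956, -28.8303, 153.2597), 1))  # ≈ 9201.4
```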

Haversine formula
  • 9198.797 miles
  • 14804.028 kilometers
  • 7993.536 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
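
The haversine calculation is compact enough to verify directly. A minimal sketch, assuming a mean Earth radius of 6371 km (the site's exact radius is unstated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers, assuming a spherical earth."""
    r = 6371.0  # assumed mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# PBI -> LSY in decimal degrees (converted from the airport table below)
km = haversine_km(26.6831, -80.0956, -28.8303, 153.2597)
print(round(km, 1), round(km / 1.609344, 1))  # ≈ 14804.0 km, ≈ 9198.8 miles
```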

How long does it take to fly from West Palm Beach to Lismore?

The estimated flight time from Palm Beach International Airport to Lismore Airport is 17 hours and 55 minutes.
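
The page does not publish its timing model, but a simple cruise-plus-overhead estimate reproduces the figure. The 850 km/h cruise speed and 30-minute takeoff/landing buffer below are assumptions back-fitted to the result, not the site's stated parameters:

```python
def flight_time(distance_km, cruise_kmh=850, overhead_min=30):
    """Rough flight-time estimate: cruise segment plus a fixed takeoff/landing buffer.

    cruise_kmh and overhead_min are assumptions, not the site's published model.
    """
    total_min = distance_km / cruise_kmh * 60 + overhead_min
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m} min"

print(flight_time(14808))  # -> 17 h 55 min
```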

Flight carbon footprint between Palm Beach International Airport (PBI) and Lismore Airport (LSY)

On average, flying from West Palm Beach to Lismore generates about 1178 kg (2597 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
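
The unit conversion, and the per-kilometer rate the figure implies, can be checked from the page's own numbers. The emission factor below is back-derived from those numbers, not an official coefficient:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 1178       # per-passenger estimate from the page
distance_km = 14808

print(round(co2_kg * KG_TO_LB))        # -> 2597 lbs
print(round(co2_kg / distance_km, 4))  # -> 0.0796 kg CO2 per km (implied factor)
```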

Map of flight path from West Palm Beach to Lismore

See the map of the shortest flight path between Palm Beach International Airport (PBI) and Lismore Airport (LSY).
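
To draw such a path, one common approach is to interpolate intermediate points along the great circle. Here is a sketch using the standard spherical intermediate-point formula; this is not necessarily how the site renders its map:

```python
import math

def great_circle_points(lat1, lon1, lat2, lon2, n=50):
    """Intermediate points along the great circle between two lat/lon pairs."""
    la1, lo1 = math.radians(lat1), math.radians(lon1)
    la2, lo2 = math.radians(lat2), math.radians(lon2)
    # angular distance between the endpoints (haversine form)
    d = 2 * math.asin(math.sqrt(
        math.sin((la2 - la1) / 2) ** 2
        + math.cos(la1) * math.cos(la2) * math.sin((lo2 - lo1) / 2) ** 2))
    points = []
    for i in range(n + 1):
        f = i / n
        A = math.sin((1 - f) * d) / math.sin(d)
        B = math.sin(f * d) / math.sin(d)
        x = A * math.cos(la1) * math.cos(lo1) + B * math.cos(la2) * math.cos(lo2)
        y = A * math.cos(la1) * math.sin(lo1) + B * math.cos(la2) * math.sin(lo2)
        z = A * math.sin(la1) + B * math.sin(la2)
        points.append((math.degrees(math.atan2(z, math.hypot(x, y))),
                       math.degrees(math.atan2(y, x))))
    return points

waypoints = great_circle_points(26.6831, -80.0956, -28.8303, 153.2597)
```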

Airport information

Origin: Palm Beach International Airport
City: West Palm Beach, FL
Country: United States
IATA Code: PBI
ICAO Code: KPBI
Coordinates: 26°40′59″N, 80°5′44″W

Destination: Lismore Airport
City: Lismore
Country: Australia
IATA Code: LSY
ICAO Code: YLIS
Coordinates: 28°49′49″S, 153°15′35″E
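
The coordinates above are given in degrees/minutes/seconds, while the distance formulas earlier expect decimal degrees. A small helper for the conversion (the function name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

pbi = (dms_to_decimal(26, 40, 59, "N"), dms_to_decimal(80, 5, 44, "W"))
lsy = (dms_to_decimal(28, 49, 49, "S"), dms_to_decimal(153, 15, 35, "E"))
print(pbi)  # ≈ (26.6831, -80.0956)
print(lsy)  # ≈ (-28.8303, 153.2597)
```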