
How far is Saibai Island from Lismore?

The distance between Lismore (Lismore Airport) and Saibai Island (Saibai Island Airport) is 1506 miles / 2423 kilometers / 1308 nautical miles.

The driving distance from Lismore (LSY) to Saibai Island (SBR) is 1806 miles / 2907 kilometers, and travel time by car is about 43 hours 40 minutes.

Lismore Airport – Saibai Island Airport

1506 miles / 2423 kilometers / 1308 nautical miles


Distance from Lismore to Saibai Island

There are several ways to calculate the distance from Lismore to Saibai Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 1505.790 miles
  • 2423.334 kilometers
  • 1308.496 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
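Vincenty's formula itself is an iterative procedure, so as a minimal sketch the ellipsoidal distance can be reproduced with pyproj's Geod class (which implements Karney's geodesic algorithm on the WGS-84 ellipsoid, a modern refinement of Vincenty); the pyproj dependency and the coordinate values are taken from the airport information below:

  # Ellipsoidal-distance sketch; assumes pyproj is installed (pip install pyproj).
  from pyproj import Geod

  # Airport coordinates in decimal degrees (south latitudes negative)
  LSY = (-28.830278, 153.259722)  # Lismore Airport
  SBR = (-9.378056, 142.625000)   # Saibai Island Airport

  geod = Geod(ellps="WGS84")
  # inv() takes lon/lat order and returns both azimuths plus distance in metres
  _, _, meters = geod.inv(LSY[1], LSY[0], SBR[1], SBR[0])

  print(f"{meters / 1609.344:.3f} miles")  # ~1506 miles
  print(f"{meters / 1000:.3f} km")         # ~2423 km
  print(f"{meters / 1852:.3f} nmi")        # ~1308 nautical miles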

Haversine formula
  • 1510.620 miles
  • 2431.108 kilometers
  • 1312.693 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
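For comparison, here is a self-contained haversine sketch using the coordinates listed in the airport information below; the DMS-to-decimal helper and the mean Earth radius of 6371 km are assumptions of this example, not part of the page's stated method:

  from math import radians, sin, cos, asin, sqrt

  def dms(deg, minutes, seconds, negative=False):
      """Convert degrees/minutes/seconds to decimal degrees."""
      dec = deg + minutes / 60 + seconds / 3600
      return -dec if negative else dec

  def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
      """Great-circle distance on a sphere of the given radius."""
      phi1, phi2 = radians(lat1), radians(lat2)
      dphi = radians(lat2 - lat1)
      dlam = radians(lon2 - lon1)
      a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
      return 2 * radius_km * asin(sqrt(a))

  # Lismore (28°49′49″S, 153°15′35″E) and Saibai Island (9°22′41″S, 142°37′30″E)
  lsy = (dms(28, 49, 49, negative=True), dms(153, 15, 35))
  sbr = (dms(9, 22, 41, negative=True), dms(142, 37, 30))

  km = haversine_km(*lsy, *sbr)
  print(f"{km:.3f} km, {km / 1.609344:.3f} miles, {km / 1.852:.3f} nmi")
  # ~2431 km / ~1511 miles / ~1313 nautical miles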

How long does it take to fly from Lismore to Saibai Island?

The estimated flight time from Lismore Airport to Saibai Island Airport is 3 hours and 21 minutes.
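The page does not state its timing model; one common approximation divides the flight distance by an average block speed. An assumed average of about 450 mph (folding in taxi, climb, and descent) roughly reproduces the quoted figure:

  # Rough flight-time sketch. The 450 mph average block speed is an
  # assumption chosen to roughly match the quoted 3 h 21 min; it is
  # not the site's stated model.
  distance_miles = 1505.790
  avg_block_speed_mph = 450  # assumed average speed over the whole flight

  hours = distance_miles / avg_block_speed_mph
  h, m = int(hours), round((hours % 1) * 60)
  print(f"Estimated flight time: {h} h {m} min")  # -> 3 h 21 min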

Flight carbon footprint between Lismore Airport (LSY) and Saibai Island Airport (SBR)

On average, flying from Lismore to Saibai Island generates about 180 kg (396 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
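As a sanity check, a commonly cited emissions factor of roughly 0.12 kg of CO2 per passenger-mile (an assumed round number, not the site's published methodology) lands close to the quoted figure:

  # Back-of-the-envelope CO2 sketch; the emissions factor is an assumption.
  distance_miles = 1505.790
  kg_per_passenger_mile = 0.12  # assumed factor, jet fuel only

  kg = distance_miles * kg_per_passenger_mile
  print(f"{kg:.0f} kg CO2 per passenger")             # ~181 kg
  print(f"{kg * 2.20462:.0f} lbs CO2 per passenger")  # ~398 lbs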

Map of flight path and driving directions from Lismore to Saibai Island

See the map of the shortest flight path between Lismore Airport (LSY) and Saibai Island Airport (SBR).

Airport information

Origin: Lismore Airport
City: Lismore
Country: Australia
IATA Code: LSY
ICAO Code: YLIS
Coordinates: 28°49′49″S, 153°15′35″E
Destination: Saibai Island Airport
City: Saibai Island
Country: Australia
IATA Code: SBR
ICAO Code: YSII
Coordinates: 9°22′41″S, 142°37′30″E