
How far is Gold Coast from Saibai Island?

The distance between Saibai Island (Saibai Island Airport) and Gold Coast (Gold Coast Airport) is 1474 miles / 2372 kilometers / 1281 nautical miles.

The driving distance from Saibai Island (SBR) to Gold Coast (OOL) is 1743 miles / 2805 kilometers, and travel time by car is about 42 hours 22 minutes.
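The stated driving distance and travel time imply an average road speed, which is easy to sanity-check (a quick calculation from the figures above, not additional data):

```python
# Implied average driving speed from the figures above.
distance_km = 2805
hours = 42 + 22 / 60                  # 42 hours 22 minutes
avg_speed = distance_km / hours
print(round(avg_speed, 1))            # ≈ 66.2 km/h
```

An average of roughly 66 km/h is plausible for a long overland route that mixes highway and rural driving.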

Saibai Island Airport – Gold Coast Airport

  • 1474 miles
  • 2372 kilometers
  • 1281 nautical miles


Distance from Saibai Island to Gold Coast

There are several ways to calculate the distance from Saibai Island to Gold Coast. Here are two standard methods:

Vincenty's formula (applied above)
  • 1473.606 miles
  • 2371.539 kilometers
  • 1280.529 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
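The iterative inverse Vincenty solution can be sketched in Python as follows. This is a minimal, distance-only implementation using WGS-84 ellipsoid constants, and it ignores the rare non-convergent near-antipodal case; the coordinates are the airports' positions converted to decimal degrees:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty solution on the WGS-84 ellipsoid (distance only)."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):                  # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                    # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# SBR and OOL in decimal degrees (from the DMS coordinates in this article)
print(round(vincenty_km(-9.3781, 142.6250, -28.1642, 153.5050), 1))  # ≈ 2371.5 km
```

The result agrees with the ~2371.5 km figure quoted above to within rounding of the input coordinates.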

Haversine formula
  • 1478.202 miles
  • 2378.936 kilometers
  • 1284.523 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
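The haversine result can be reproduced directly from the airport coordinates listed below (converted to decimal degrees); this is a standard textbook implementation assuming a mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# SBR and OOL in decimal degrees (from the DMS coordinates in this article)
sbr = (-9.3781, 142.6250)    # Saibai Island Airport
ool = (-28.1642, 153.5050)   # Gold Coast Airport

print(round(haversine_km(*sbr, *ool), 1))  # ≈ 2379 km
```

The output matches the ~2378.9 km figure quoted above, up to rounding of the input coordinates.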

How long does it take to fly from Saibai Island to Gold Coast?

The estimated flight time from Saibai Island Airport to Gold Coast Airport is 3 hours and 17 minutes.
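A common rule of thumb for such estimates is cruise time at a fixed average speed plus a fixed allowance for taxi, climb, and descent. The constants below are assumptions for illustration, not the site's actual model, so the result differs slightly from the 3 hours 17 minutes quoted above:

```python
# Rough flight-time model (assumed constants): cruise at ~500 mph plus
# ~30 minutes of overhead for taxi, takeoff, climb, and descent.
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
    return f"{total_min // 60} h {total_min % 60} min"

print(estimate_flight_time(1473.606))  # → 3 h 27 min
```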

Flight carbon footprint between Saibai Island Airport (SBR) and Gold Coast Airport (OOL)

On average, flying from Saibai Island to Gold Coast generates about 178 kg of CO2 per passenger, which is equivalent to 392 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
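The kilograms-to-pounds conversion quoted above can be checked using the exact definition of the pound:

```python
KG_PER_LB = 0.45359237        # exact definition of the avoirdupois pound

co2_kg = 178                  # per-passenger estimate from the text
co2_lbs = co2_kg / KG_PER_LB
print(round(co2_lbs))         # → 392
```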


Airport information

Origin: Saibai Island Airport
City: Saibai Island
Country: Australia
IATA Code: SBR
ICAO Code: YSII
Coordinates: 9°22′41″S, 142°37′30″E
Destination: Gold Coast Airport
City: Gold Coast
Country: Australia
IATA Code: OOL
ICAO Code: YBCG
Coordinates: 28°9′51″S, 153°30′18″E
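The DMS coordinates above convert to the decimal degrees used in the formulas with a simple helper (the function name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in "SW" else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Saibai Island Airport: 9°22′41″S, 142°37′30″E
print(round(dms_to_decimal(9, 22, 41, "S"), 4))    # → -9.3781
print(round(dms_to_decimal(142, 37, 30, "E"), 4))  # → 142.625
```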