
How far is Gold Coast from Kubin Island?

The distance between Kubin Island (Kubin Airport) and Gold Coast (Gold Coast Airport) is 1435 miles / 2310 kilometers / 1247 nautical miles.

The driving distance from Kubin Island (KUG) to Gold Coast (OOL) is 1743 miles / 2805 kilometers, and travel time by car is about 42 hours 22 minutes.

Kubin Airport – Gold Coast Airport

  • 1435 miles
  • 2310 kilometers
  • 1247 nautical miles


Distance from Kubin Island to Gold Coast

There are several ways to calculate the distance from Kubin Island to Gold Coast. Here are two standard methods:

Vincenty's formula (applied above)
  • 1435.452 miles
  • 2310.136 kilometers
  • 1247.374 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
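As a sketch of how such a figure can be computed, here is a plain-Python implementation of Vincenty's inverse method on the WGS-84 ellipsoid. This is an illustration of the general technique, not the calculator's actual code, and the decimal coordinates are converted from the airport table further down this page:

```python
import math

def vincenty(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two points (WGS-84, Vincenty inverse)."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)  # metres

# KUG → OOL, coordinates from the airport table below (decimal degrees)
km = vincenty(-10.225, 142.2178, -28.1642, 153.5050) / 1000.0  # ≈ 2310 km
```

Unlike the haversine formula below, Vincenty's method iterates to convergence, which is why it accounts for the earth's flattening at the poles.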

Haversine formula
  • 1439.626 miles
  • 2316.854 kilometers
  • 1251.001 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
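The haversine calculation is much simpler. A minimal sketch, assuming a mean earth radius of 6371 km and using the airport coordinates from the table below:

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometres, assuming a sphere of radius r km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# KUG → OOL, coordinates from the airport table below (decimal degrees)
km = haversine(-10.225, 142.2178, -28.1642, 153.5050)  # ≈ 2317 km
```

The spherical result lands within about 0.3% of the ellipsoidal one, which is typical: haversine is accurate enough for most route planning, while Vincenty's formula is preferred when sub-kilometre accuracy matters.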

How long does it take to fly from Kubin Island to Gold Coast?

The estimated flight time from Kubin Airport to Gold Coast Airport is 3 hours and 13 minutes.

Flight carbon footprint between Kubin Airport (KUG) and Gold Coast Airport (OOL)

On average, flying from Kubin Island to Gold Coast generates about 175 kg of CO2 per passenger; 175 kilograms is about 386 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Kubin Island to Gold Coast

See the map of the shortest flight path between Kubin Airport (KUG) and Gold Coast Airport (OOL).

Airport information

Origin Kubin Airport
City: Kubin Island
Country: Australia
IATA Code: KUG
ICAO Code: YKUB
Coordinates: 10°13′30″S, 142°13′4″E
Destination Gold Coast Airport
City: Gold Coast
Country: Australia
IATA Code: OOL
ICAO Code: YBCG
Coordinates: 28°9′51″S, 153°30′18″E
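The coordinates above are listed in degrees, minutes, and seconds, while distance formulas such as haversine and Vincenty expect signed decimal degrees (south and west negative). A small conversion sketch (the helper name `dms_to_decimal` is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (deg + minutes / 60.0 + seconds / 3600.0)

# Kubin Airport (KUG): 10°13′30″S, 142°13′4″E
kug = (dms_to_decimal(10, 13, 30, "S"), dms_to_decimal(142, 13, 4, "E"))
# Gold Coast Airport (OOL): 28°9′51″S, 153°30′18″E
ool = (dms_to_decimal(28, 9, 51, "S"), dms_to_decimal(153, 30, 18, "E"))
```

For example, 10°13′30″S becomes −10.225, and 153°30′18″E becomes 153.505.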