How far is Saibai Island from Wagga Wagga?
The distance between Wagga Wagga (Wagga Wagga Airport) and Saibai Island (Saibai Island Airport) is 1801 miles / 2898 kilometers / 1565 nautical miles.
The driving distance from Wagga Wagga (WGA) to Saibai Island (SBR) is 2187 miles / 3520 kilometers, and travel time by car is about 52 hours 17 minutes.
Wagga Wagga Airport – Saibai Island Airport
Distance from Wagga Wagga to Saibai Island
There are several ways to calculate the distance from Wagga Wagga to Saibai Island. Here are two standard methods:
Vincenty's formula (applied above)
- 1800.729 miles
- 2897.993 kilometers
- 1564.791 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
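As a sketch, the inverse Vincenty method can be implemented in Python roughly as follows. The function name, iteration limit, and tolerance are our choices; the constants are the standard WGS-84 ellipsoid parameters, and the coordinates are the WGA and SBR values from the airport tables below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m
              * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# WGA and SBR coordinates in decimal degrees (south latitudes negative)
wga = (-(35 + 9/60 + 55/3600), 147 + 27/60 + 57/3600)
sbr = (-(9 + 22/60 + 41/3600), 142 + 37/60 + 30/3600)
print(round(vincenty_distance(*wga, *sbr) / 1000, 1))  # kilometres, ≈ 2898
```

The iteration refines the longitude difference on an auxiliary sphere until it converges, which is why Vincenty is slower but more accurate than a single-shot spherical formula.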
Haversine formula
- 1807.716 miles
- 2909.237 kilometers
- 1570.863 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
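The haversine calculation is compact enough to show in full. This is a minimal sketch assuming a mean Earth radius of 6371 km (the function name is ours); the coordinates are the WGA and SBR values from the airport tables below.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# WGA and SBR coordinates in decimal degrees (south latitudes negative)
wga = (-(35 + 9/60 + 55/3600), 147 + 27/60 + 57/3600)
sbr = (-(9 + 22/60 + 41/3600), 142 + 37/60 + 30/3600)
print(round(haversine_distance(*wga, *sbr), 1))  # kilometres, ≈ 2909
```

Because it treats the Earth as a perfect sphere, the result differs from the Vincenty figure above by about 11 km on this route.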
How long does it take to fly from Wagga Wagga to Saibai Island?
The estimated flight time from Wagga Wagga Airport to Saibai Island Airport is 3 hours and 54 minutes.
What is the time difference between Wagga Wagga and Saibai Island?
Both Wagga Wagga and Saibai Island observe Australian Eastern Standard Time (UTC+10), so there is normally no time difference. During daylight saving time, New South Wales shifts to UTC+11 while Queensland does not, so Wagga Wagga is then one hour ahead of Saibai Island.
Flight carbon footprint between Wagga Wagga Airport (WGA) and Saibai Island Airport (SBR)
On average, flying from Wagga Wagga to Saibai Island generates about 200 kg (441 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path and driving directions from Wagga Wagga to Saibai Island
See the map of the shortest flight path between Wagga Wagga Airport (WGA) and Saibai Island Airport (SBR).
Airport information
| Origin | Wagga Wagga Airport |
|---|---|
| City: | Wagga Wagga |
| Country: | Australia |
| IATA Code: | WGA |
| ICAO Code: | YSWG |
| Coordinates: | 35°9′55″S, 147°27′57″E |
Destination | Saibai Island Airport |
---|---|
City: | Saibai Island |
Country: | Australia |
IATA Code: | SBR |
ICAO Code: | YSII |
Coordinates: | 9°22′41″S, 142°37′30″E |