How far is Mabuiag Island from Weipa?
The distance between Weipa (Weipa Airport) and Mabuiag Island (Mabuiag Island Airport) is 188 miles / 303 kilometers / 164 nautical miles.
The driving distance from Weipa (WEI) to Mabuiag Island (UBB) is 252 miles / 406 kilometers, and travel time by car is about 10 hours 31 minutes.
Weipa Airport – Mabuiag Island Airport
Distance from Weipa to Mabuiag Island
There are several ways to calculate the distance from Weipa to Mabuiag Island. Here are two standard methods:
Vincenty's formula (applied above)
- 188.363 miles
- 303.141 kilometers
- 163.683 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
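The ellipsoidal calculation can be sketched with the standard Vincenty inverse formula on the WGS-84 ellipsoid. This is a minimal illustration, not the page's exact implementation; the decimal-degree coordinates below are converted from the airport coordinates listed further down.

```python
import math

# Vincenty inverse formula on the WGS-84 ellipsoid (minimal sketch).
def vincenty_km(lat1, lon1, lat2, lon2):
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis, flattening
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L
    for _ in range(200):                   # iterate until lambda converges
        sin_sig = math.hypot(
            math.cos(U2) * math.sin(lam),
            math.cos(U1) * math.sin(U2)
            - math.sin(U1) * math.cos(U2) * math.cos(lam),
        )
        cos_sig = (math.sin(U1) * math.sin(U2)
                   + math.cos(U1) * math.cos(U2) * math.cos(lam))
        sig = math.atan2(sin_sig, cos_sig)
        sin_alpha = math.cos(U1) * math.cos(U2) * math.sin(lam) / sin_sig
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigm = cos_sig - 2 * math.sin(U1) * math.sin(U2) / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sig + C * sin_sig * (cos_2sigm
                                 + C * cos_sig * (-1 + 2 * cos_2sigm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dsig = B * sin_sig * (cos_2sigm + B / 4 * (
        cos_sig * (-1 + 2 * cos_2sigm ** 2)
        - B / 6 * cos_2sigm * (-3 + 4 * sin_sig ** 2)
        * (-3 + 4 * cos_2sigm ** 2)))
    return b * A * (sig - dsig) / 1000     # meters -> kilometers

# Weipa (WEI) and Mabuiag Island (UBB) in decimal degrees
d = vincenty_km(-12.6783, 141.9250, -9.9497, 142.1828)
print(f"{d:.1f} km")  # close to the 303.141 km quoted above
```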
Haversine formula
- 189.337 miles
- 304.708 kilometers
- 164.529 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
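The spherical calculation above can be reproduced in a few lines. This is a minimal sketch using a 6,371 km mean Earth radius and the airport coordinates from the tables below, converted from degrees/minutes/seconds to decimal degrees.

```python
import math

# Haversine great-circle distance on a sphere (minimal sketch).
def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Weipa (WEI): 12°40′42″S, 141°55′30″E
# Mabuiag Island (UBB): 9°56′59″S, 142°10′58″E
wei = (-(12 + 40 / 60 + 42 / 3600), 141 + 55 / 60 + 30 / 3600)
ubb = (-(9 + 56 / 60 + 59 / 3600), 142 + 10 / 60 + 58 / 3600)
d = haversine_km(*wei, *ubb)
print(f"{d:.1f} km")  # close to the 304.708 km quoted above
```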
How long does it take to fly from Weipa to Mabuiag Island?
The estimated flight time from Weipa Airport to Mabuiag Island Airport is 51 minutes.
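An estimate like this usually comes from a simple model: a cruise leg at a typical small-aircraft speed plus a fixed allowance for taxi, climb, and descent. The 500 km/h cruise speed and 15-minute buffer below are illustrative assumptions chosen for this sketch, not figures from this page.

```python
# Rough flight-time sketch. Both constants are assumptions; actual
# schedules depend on the aircraft type, winds, and routing.
distance_km = 303        # great-circle distance from this page
cruise_kmh = 500         # assumed cruise speed of a small regional aircraft
buffer_min = 15          # assumed allowance for taxi, climb, and descent
minutes = distance_km / cruise_kmh * 60 + buffer_min
print(round(minutes), "minutes")
```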
What is the time difference between Weipa and Mabuiag Island?
There is no time difference between Weipa and Mabuiag Island.
Flight carbon footprint between Weipa Airport (WEI) and Mabuiag Island Airport (UBB)
On average, flying from Weipa to Mabuiag Island generates about 53 kg of CO2 per passenger (roughly 116 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
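A per-passenger figure like this is typically the flight distance multiplied by an emission factor. The factor below is simply inferred from this page's own numbers (53 kg over 303 km), not an official value.

```python
# Back-of-the-envelope CO2 sketch. The emission factor is derived
# from this page's figures, not from an aviation emissions standard.
distance_km = 303
kg_per_km = 53 / 303            # ~0.175 kg CO2 per passenger-kilometer
co2_kg = distance_km * kg_per_km
co2_lb = co2_kg * 2.20462       # kilograms to pounds
print(round(co2_kg), "kg /", round(co2_lb), "lb")
```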
Map of flight path and driving directions from Weipa to Mabuiag Island
See the map of the shortest flight path between Weipa Airport (WEI) and Mabuiag Island Airport (UBB).
Airport information
| Origin | Weipa Airport |
| --- | --- |
| City | Weipa |
| Country | Australia |
| IATA Code | WEI |
| ICAO Code | YBWP |
| Coordinates | 12°40′42″S, 141°55′30″E |
| Destination | Mabuiag Island Airport |
| --- | --- |
| City | Mabuiag Island |
| Country | Australia |
| IATA Code | UBB |
| ICAO Code | YMAA |
| Coordinates | 9°56′59″S, 142°10′58″E |