
How far is Denham from Whyalla?

The distance between Whyalla (Whyalla Airport) and Denham (Shark Bay Airport) is 1520 miles / 2447 kilometers / 1321 nautical miles.

The driving distance from Whyalla (WYA) to Denham (MJK) is 1896 miles / 3052 kilometers, and travel time by car is about 36 hours 16 minutes.

Whyalla Airport – Shark Bay Airport

  • Distance: 1520 miles / 2447 kilometers / 1321 nautical miles
  • Flight time: 3 h 22 min
  • Time difference: 2 h 30 min
  • CO2 emission: 181 kg


Distance from Whyalla to Denham

There are several ways to calculate the distance from Whyalla to Denham. Here are two standard methods:

Vincenty's formula (applied above)
  • 1520.476 miles
  • 2446.969 kilometers
  • 1321.258 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
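For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the decimal-degree equivalents of the DMS values listed under Airport information below; the constants, tolerance and iteration cap are standard textbook choices, not the calculator's own code.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2) -
        B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# Whyalla (WYA) to Shark Bay (MJK); should come out close to the 2446.969 km quoted above
print(round(vincenty_km(-33.0589, 137.5139, -25.8939, 113.5769), 3))
```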

Haversine formula
  • 1518.363 miles
  • 2443.568 kilometers
  • 1319.421 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
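For comparison, a minimal haversine sketch; the 6371 km mean Earth radius is an assumption, as the calculator does not state which radius it uses.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere (haversine formula); returns kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Whyalla (WYA) to Shark Bay (MJK); lands near the ~2443.6 km haversine figure above
print(round(haversine_km(-33.0589, 137.5139, -25.8939, 113.5769), 1))
```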

How long does it take to fly from Whyalla to Denham?

The estimated flight time from Whyalla Airport to Shark Bay Airport is 3 hours and 22 minutes.
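The calculator does not publish the assumptions behind this estimate. A common rule of thumb is an average speed of about 500 mph plus roughly 30 minutes for taxi, climb and descent, which lands in the same ballpark as the figure above; the sketch below uses those assumed values, not the site's actual formula.

```python
# Hypothetical rule-of-thumb estimate; not the calculator's actual formula.
distance_miles = 1520        # great-circle distance quoted above
cruise_speed_mph = 500       # assumed average speed
overhead_min = 30            # assumed allowance for taxi, climb and descent

total_min = distance_miles / cruise_speed_mph * 60 + overhead_min
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # about 3 h 32 min
```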

Flight carbon footprint between Whyalla Airport (WYA) and Shark Bay Airport (MJK)

On average, flying from Whyalla to Denham generates about 181 kg of CO2 per passenger; 181 kilograms is equal to about 399 pounds (lb). The figures are estimates and include only the CO2 generated by burning jet fuel.
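The pound figure follows from the standard conversion factor: 181 kg × 2.20462 lb/kg ≈ 399 lb.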

Map of flight path and driving directions from Whyalla to Denham

See the map of the shortest flight path between Whyalla Airport (WYA) and Shark Bay Airport (MJK).

Airport information

Origin: Whyalla Airport
City: Whyalla
Country: Australia
IATA Code: WYA
ICAO Code: YWHA
Coordinates: 33°3′32″S, 137°30′50″E
Destination: Shark Bay Airport
City: Denham
Country: Australia
IATA Code: MJK
ICAO Code: YSHK
Coordinates: 25°53′38″S, 113°34′37″E
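The distance formulas above take decimal degrees; the DMS coordinates listed here convert as in this small sketch (the helper function is illustrative, with southern latitudes taken as negative).

```python
def dms_to_decimal(degrees, minutes, seconds, negative=False):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if negative else value

# Whyalla Airport (WYA): 33°3′32″S, 137°30′50″E
wya = (dms_to_decimal(33, 3, 32, negative=True), dms_to_decimal(137, 30, 50))
# Shark Bay Airport (MJK): 25°53′38″S, 113°34′37″E
mjk = (dms_to_decimal(25, 53, 38, negative=True), dms_to_decimal(113, 34, 37))
print(wya)  # approximately (-33.0589, 137.5139)
print(mjk)  # approximately (-25.8939, 113.5769)
```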