
How far is Karratha from Newcastle?

The distance between Newcastle (Newcastle Airport) and Karratha (Karratha Airport) is 2308 miles / 3714 kilometers / 2006 nautical miles.

The driving distance from Newcastle (NTL) to Karratha (KTA) is 3111 miles / 5006 kilometers, and travel time by car is about 61 hours 26 minutes.


Distance from Newcastle to Karratha

There are several ways to calculate the distance from Newcastle to Karratha. Here are two standard methods:

Vincenty's formula (applied above)
  • 2307.960 miles
  • 3714.301 kilometers
  • 2005.562 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
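A minimal sketch of Vincenty's inverse method in Python, assuming the WGS-84 ellipsoid (a standard choice; the page does not state which ellipsoid it uses). The method iterates on the difference in longitude along the geodesic until it converges:

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L

    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(
            math.cos(U2) * sin_lam,
            math.cos(U1) * math.sin(U2) - math.sin(U1) * math.cos(U2) * cos_lam,
        )
        cos_sigma = math.sin(U1) * math.sin(U2) + math.cos(U1) * math.cos(U2) * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = math.cos(U1) * math.cos(U2) * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * math.sin(U1) * math.sin(U2) / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2))
        )
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)
        )
    )
    return b * A * (sigma - delta_sigma)

# Newcastle (NTL) to Karratha (KTA), coordinates in decimal degrees
print(vincenty_distance_m(-32.794722, 151.833889, -20.711944, 116.772778) / 1000)
# ≈ 3714 km, in line with the Vincenty figure above
```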

Haversine formula
  • 2305.418 miles
  • 3710.211 kilometers
  • 2003.354 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
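The spherical calculation is much shorter. A sketch assuming a mean Earth radius of 6,371 km (the page does not state which radius it uses):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

# Newcastle (NTL) to Karratha (KTA), coordinates in decimal degrees
print(haversine_km(-32.794722, 151.833889, -20.711944, 116.772778))
# ≈ 3710 km, in line with the haversine figure above
```

The two results differ by only about 4 km (roughly 0.1%) because the Earth is very nearly spherical; the ellipsoidal model matters mainly when sub-kilometre accuracy is needed.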

How long does it take to fly from Newcastle to Karratha?

The estimated flight time from Newcastle Airport to Karratha Airport is 4 hours and 52 minutes.
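The page does not publish its flight-time model, but the figure is consistent with a simple rule of thumb: the distance flown at an assumed 500 mph average speed plus a fixed 15-minute allowance for takeoff and landing. Both values are illustrative assumptions, not the calculator's confirmed parameters:

```python
def estimated_flight_time(distance_miles, avg_speed_mph=500, overhead_min=15):
    """Rough flight time: cruise time plus a fixed takeoff/landing allowance."""
    minutes = round(overhead_min + distance_miles / avg_speed_mph * 60)
    return divmod(minutes, 60)  # (hours, minutes)

print(estimated_flight_time(2308))  # (4, 52) -> 4 hours 52 minutes
```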

Flight carbon footprint between Newcastle Airport (NTL) and Karratha Airport (KTA)

On average, flying from Newcastle to Karratha generates about 253 kg of CO2 per passenger, equivalent to 557 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
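The kilogram-to-pound conversion uses the exact definition of the avoirdupois pound (1 lb = 0.45359237 kg):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 253
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb, 1))  # 557.8, reported above as 557 lbs
```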

Map of flight path and driving directions from Newcastle to Karratha

See the map of the shortest flight path between Newcastle Airport (NTL) and Karratha Airport (KTA).

Airport information

Origin: Newcastle Airport
City: Newcastle
Country: Australia
IATA Code: NTL
ICAO Code: YWLM
Coordinates: 32°47′41″S, 151°50′2″E
Destination: Karratha Airport
City: Karratha
Country: Australia
IATA Code: KTA
ICAO Code: YPKA
Coordinates: 20°42′43″S, 116°46′22″E