
How far is Gold Coast from Pittsburgh, PA?

The distance between Pittsburgh (Pittsburgh International Airport) and Gold Coast (Gold Coast Airport) is 9304 miles / 14974 kilometers / 8085 nautical miles.

Pittsburgh International Airport – Gold Coast Airport

Distance: 9304 miles / 14974 kilometers / 8085 nautical miles
Flight time: 18 h 6 min
CO2 emission: 1 194 kg

Distance from Pittsburgh to Gold Coast

There are several ways to calculate the distance from Pittsburgh to Gold Coast. Here are two standard methods:

Vincenty's formula (applied above)
  • 9304.336 miles
  • 14973.877 kilometers
  • 8085.247 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
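
As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values in the airport information section below; the tolerance and iteration cap are choices of this sketch, and the calculator's own implementation may differ slightly.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

# WGS-84 ellipsoid parameters
A_AXIS = 6378137.0             # semi-major axis in metres
F = 1 / 298.257223563          # flattening
B_AXIS = (1 - F) * A_AXIS      # semi-minor axis

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal (Vincenty inverse) distance in kilometres."""
    L = radians(lon2 - lon1)
    U1 = atan((1 - F) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - F) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); equatorial lines have cos_sq_alpha == 0
        cos2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = F / 16 * cos_sq_alpha * (4 + F * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    big_a = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    big_b = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = big_b * sin_sigma * (cos2sm + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2) -
        big_b / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return B_AXIS * big_a * (sigma - delta_sigma) / 1000.0

# PIT and OOL in decimal degrees
print(vincenty_km(40.4914, -80.2328, -28.1642, 153.5050))  # close to the ~14 974 km quoted above
```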

Haversine formula
  • 9304.963 miles
  • 14974.886 kilometers
  • 8085.791 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
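
For comparison, a minimal haversine sketch, assuming a mean Earth radius of 6 371 km (the exact result depends on the radius chosen):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometres."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

print(haversine_km(40.4914, -80.2328, -28.1642, 153.5050))  # roughly 14 975 km with this radius
```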

How long does it take to fly from Pittsburgh to Gold Coast?

The estimated flight time from Pittsburgh International Airport to Gold Coast Airport is 18 hours and 6 minutes.
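
The calculator does not publish the speed and overhead it assumes, but the quoted time can be sanity-checked against the distance above: covering 9 304 miles in 18 hours and 6 minutes implies an average block speed of roughly 514 mph. A rough sketch with a configurable average speed (the speed is an assumption, not the site's formula):

```python
def estimated_flight_time(distance_miles, avg_speed_mph=514):
    """Very rough block-time estimate from distance and an assumed average speed."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    if m == 60:            # handle rounding up to the next hour
        h, m = h + 1, 0
    return f"{h} h {m} min"

print(estimated_flight_time(9304.336))  # about 18 h 6 min with the assumed speed
```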

Flight carbon footprint between Pittsburgh International Airport (PIT) and Gold Coast Airport (OOL)

On average, flying from Pittsburgh to Gold Coast generates about 1 194 kg of CO2 per passenger, which is equivalent to about 2 632 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
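
The kilogram-to-pound conversion, and the per-mile emission rate implied by the figures on this page (about 0.128 kg of CO2 per passenger-mile), can be checked with a few lines; the per-mile rate is simply derived from the numbers above, not from a published methodology:

```python
KG_PER_LB = 0.45359237          # exact definition of the international pound

co2_kg = 1194
distance_miles = 9304.336

print(co2_kg / KG_PER_LB)       # about 2 632 lbs
print(co2_kg / distance_miles)  # about 0.128 kg CO2 per passenger-mile (implied)
```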

Map of flight path from Pittsburgh to Gold Coast

See the map of the shortest flight path between Pittsburgh International Airport (PIT) and Gold Coast Airport (OOL).
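
The shortest flight path drawn on such a map follows the great circle between the two airports. One way to draw it is to sample intermediate points by spherical linear interpolation; the sketch below is an illustration of that idea, not the calculator's own plotting code.

```python
from math import asin, atan2, cos, degrees, radians, sin, sqrt

def great_circle_points(lat1, lon1, lat2, lon2, n=50):
    """Sample n+1 (lat, lon) points along the great circle from point 1 to point 2."""
    phi1, lam1 = radians(lat1), radians(lon1)
    phi2, lam2 = radians(lat2), radians(lon2)
    # Angular distance between the endpoints (haversine form)
    d = 2 * asin(sqrt(sin((phi2 - phi1) / 2) ** 2 +
                      cos(phi1) * cos(phi2) * sin((lam2 - lam1) / 2) ** 2))
    points = []
    for i in range(n + 1):
        f = i / n
        a = sin((1 - f) * d) / sin(d)
        b = sin(f * d) / sin(d)
        x = a * cos(phi1) * cos(lam1) + b * cos(phi2) * cos(lam2)
        y = a * cos(phi1) * sin(lam1) + b * cos(phi2) * sin(lam2)
        z = a * sin(phi1) + b * sin(phi2)
        points.append((degrees(atan2(z, sqrt(x * x + y * y))), degrees(atan2(y, x))))
    return points

waypoints = great_circle_points(40.4914, -80.2328, -28.1642, 153.5050)
```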

Airport information

Origin: Pittsburgh International Airport
City: Pittsburgh, PA
Country: United States
IATA Code: PIT
ICAO Code: KPIT
Coordinates: 40°29′29″N, 80°13′58″W
Destination: Gold Coast Airport
City: Gold Coast
Country: Australia
IATA Code: OOL
ICAO Code: YBCG
Coordinates: 28°9′51″S, 153°30′18″E
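
The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier on the page work in decimal degrees. A small conversion sketch using the values from this listing:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

pit = (dms_to_decimal(40, 29, 29, "N"), dms_to_decimal(80, 13, 58, "W"))   # about (40.4914, -80.2328)
ool = (dms_to_decimal(28, 9, 51, "S"), dms_to_decimal(153, 30, 18, "E"))   # about (-28.1642, 153.5050)
```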