
How far is Port Augusta from Boston, MA?

The distance between Boston (Logan International Airport) and Port Augusta (Port Augusta Airport) is 10732 miles / 17271 kilometers / 9326 nautical miles.

Logan International Airport – Port Augusta Airport

Distance: 10732 miles / 17271 kilometers / 9326 nautical miles
Flight time: 20 h 49 min
Time difference: 15 h 30 min
CO2 emission: 1 418 kg


Distance from Boston to Port Augusta

There are several ways to calculate the distance from Boston to Port Augusta. Here are two standard methods:

Vincenty's formula (applied above)
  • 10731.704 miles
  • 17271.004 kilometers
  • 9325.596 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
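As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The helper name, convergence tolerance and decimal coordinates are assumptions for this example; production code should rely on a tested geodesic library.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Ellipsoidal distance in meters (Vincenty inverse formula, WGS-84)."""
        a = 6378137.0              # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                            if cos2_alpha != 0 else 0.0)  # equatorial line
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (
            cos_2sigma_m + B / 4 * (
                cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
                - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma)  # meters

    # BOS and PUG coordinates in decimal degrees (from the airport information below)
    meters = vincenty_distance(42.364167, -71.005, -32.506667, 137.716944)
    print(f"{meters / 1000:.1f} km, {meters / 1609.344:.1f} mi")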

Haversine formula
  • 10730.785 miles
  • 17269.524 kilometers
  • 9324.797 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
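For comparison, a minimal haversine sketch, assuming a mean Earth radius of 6371 km, could look like this:

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in kilometers, assuming a spherical earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    km = haversine_distance(42.364167, -71.005, -32.506667, 137.716944)
    print(f"{km:.1f} km  /  {km / 1.609344:.1f} mi  /  {km / 1.852:.1f} nm")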

How long does it take to fly from Boston to Port Augusta?

The estimated flight time from Logan International Airport to Port Augusta Airport is 20 hours and 49 minutes.
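The exact assumptions behind this figure are not stated on the page. A common back-of-the-envelope estimate adds a fixed allowance for taxi, climb and descent to the great-circle distance divided by an assumed cruise speed; the sketch below uses a 30-minute overhead and a 500 mph cruise speed, which are illustrative assumptions rather than the calculator's published method.

    def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
        """Rough flight-time estimate: a fixed taxi/climb/descent allowance plus
        cruise time at an assumed average speed. Both parameters are illustrative
        assumptions, not the calculator's published method."""
        total_minutes = round((overhead_hours + distance_miles / cruise_mph) * 60)
        hours, minutes = divmod(total_minutes, 60)
        return f"{hours} h {minutes:02d} min"

    print(estimate_flight_time(10732))  # uses the BOS-PUG great-circle distance in miles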

Flight carbon footprint between Logan International Airport (BOS) and Port Augusta Airport (PUG)

On average, flying from Boston to Port Augusta generates about 1 418 kg of CO2 per passenger, which is roughly 3 126 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
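The kilograms-to-pounds conversion is straightforward; the per-passenger figure itself depends on the calculator's emission model, which is not spelled out here. The sketch below uses an assumed long-haul emission factor per passenger-kilometre, not the calculator's published methodology.

    KG_PER_LB = 0.45359237  # kilograms per pound (exact)

    def estimate_co2_kg(distance_km, kg_per_pax_km=0.082):
        """Very rough per-passenger CO2 estimate. The emission factor
        (kg CO2 per passenger-kilometre) is an assumed long-haul average,
        not the calculator's published methodology."""
        return distance_km * kg_per_pax_km

    co2 = estimate_co2_kg(17271)
    print(f"~{co2:.0f} kg CO2  (~{co2 / KG_PER_LB:.0f} lbs)")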

Map of flight path from Boston to Port Augusta

See the map of the shortest flight path between Logan International Airport (BOS) and Port Augusta Airport (PUG).

Airport information

Origin: Logan International Airport
City: Boston, MA
Country: United States
IATA Code: BOS
ICAO Code: KBOS
Coordinates: 42°21′51″N, 71°0′18″W
Destination: Port Augusta Airport
City: Port Augusta
Country: Australia
IATA Code: PUG
ICAO Code: YPAG
Coordinates: 32°30′24″S, 137°43′1″E
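The coordinates above are given in degrees, minutes and seconds; a small sketch for converting them to the signed decimal degrees used in the distance code earlier (the helper name is illustrative):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
        to signed decimal degrees."""
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    # BOS: 42°21′51″N, 71°0′18″W    PUG: 32°30′24″S, 137°43′1″E
    print(dms_to_decimal(42, 21, 51, "N"), dms_to_decimal(71, 0, 18, "W"))
    print(dms_to_decimal(32, 30, 24, "S"), dms_to_decimal(137, 43, 1, "E"))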