How far is Port Lincoln from Boston, MA?

The distance between Boston (Logan International Airport) and Port Lincoln (Port Lincoln Airport) is 10896 miles / 17536 kilometers / 9468 nautical miles.

Logan International Airport – Port Lincoln Airport

Distance: 10896 miles / 17536 kilometers / 9468 nautical miles
Flight time: 21 h 7 min
Time difference: 15 h 30 min
CO2 emission: 1 444 kg

Distance from Boston to Port Lincoln

There are several ways to calculate the distance from Boston to Port Lincoln. Here are two standard methods:

Vincenty's formula (applied above)
  • 10896.067 miles
  • 17535.520 kilometers
  • 9468.423 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
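The site does not publish its code, but Vincenty's inverse method is well documented. Here is a minimal Python sketch on the WGS-84 ellipsoid (the coordinates are the airport coordinates listed at the bottom of the page, converted to decimal degrees):

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles.

    Assumes the two points are distinct and not nearly antipodal
    (the iteration can fail to converge close to the antipode).
    """
    a = 6378137.0              # WGS-84 semi-major axis, meters
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344  # meters -> statute miles

# BOS (42°21′51″N, 71°0′18″W) to PLO (34°36′19″S, 135°52′48″E)
print(vincenty_miles(42.364167, -71.005, -34.605278, 135.88))  # ≈ 10896 miles
```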

Haversine formula
  • 10895.010 miles
  • 17533.819 kilometers
  • 9467.505 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
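A sketch of the same calculation with the haversine formula. The mean Earth radius of 6 371 km is an assumption here, so the last decimals may differ slightly from the figures above:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere, in statute miles.

    radius_km = 6371 is an assumed mean Earth radius, not a published
    constant from this site.
    """
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h)) / 1.609344  # km -> miles

print(haversine_miles(42.364167, -71.005, -34.605278, 135.88))  # ≈ 10895 miles
```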

How long does it take to fly from Boston to Port Lincoln?

The estimated flight time from Logan International Airport to Port Lincoln Airport is 21 hours and 7 minutes.
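The site does not state its formula; a common approach is distance divided by an assumed average speed. An average of roughly 516 mph happens to reproduce the figure above, so the sketch below uses that purely as an illustrative assumption:

```python
def flight_time(distance_miles, avg_speed_mph=516):
    """Rough flight-time estimate: distance over an assumed average speed.

    516 mph is not a published figure; it is chosen here because it
    reproduces the 21 h 7 min estimate for 10896 miles.
    """
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m} min"

print(flight_time(10896))  # 21 h 7 min
```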

Flight carbon footprint between Logan International Airport (BOS) and Port Lincoln Airport (PLO)

On average, flying from Boston to Port Lincoln generates about 1 444 kg of CO2 per passenger, which is equivalent to 3 184 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
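The kilograms-to-pounds conversion uses a fixed factor, as in this one-line sketch (rounding the already-rounded 1 444 kg gives 3 183 lb; the page's 3 184 figure likely comes from an unrounded kg value):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 1444
print(round(co2_kg / KG_PER_LB))  # 3183 lb (page shows 3 184, from an unrounded input)
```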

Map of flight path from Boston to Port Lincoln

See the map of the shortest flight path between Logan International Airport (BOS) and Port Lincoln Airport (PLO).

Airport information

Origin: Logan International Airport
City: Boston, MA
Country: United States
IATA Code: BOS
ICAO Code: KBOS
Coordinates: 42°21′51″N, 71°0′18″W

Destination: Port Lincoln Airport
City: Port Lincoln
Country: Australia
IATA Code: PLO
ICAO Code: YPLC
Coordinates: 34°36′19″S, 135°52′48″E
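The distance sketches above take decimal degrees, while the coordinates here are listed in degrees/minutes/seconds. A small helper to convert between the two (a sketch assuming the exact format shown above):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Parse a coordinate like 42°21′51″N into signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    # South and West hemispheres are negative by convention
    return -value if hemi in "SW" else value

print(dms_to_decimal("42°21′51″N"))   # ≈ 42.364167
print(dms_to_decimal("135°52′48″E"))  # 135.88
```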