
How far is Lopez, WA, from Porto Alegre?

The distance between Porto Alegre (Salgado Filho Porto Alegre International Airport) and Lopez (Lopez Island Airport) is 6982 miles / 11236 kilometers / 6067 nautical miles.



Distance from Porto Alegre to Lopez

There are several ways to calculate the distance from Porto Alegre to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 6981.894 miles
  • 11236.270 kilometers
  • 6067.100 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
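For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The site does not publish its implementation, so the convergence tolerance and iteration cap below are illustrative choices, and the antipodal edge case (where the iteration can fail to converge) is not handled.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in meters between two points on the WGS-84 ellipsoid,
    computed with Vincenty's inverse formula (iterative)."""
    a = 6378137.0          # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563  # WGS-84 flattening
    b = (1 - f) * a        # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sig_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                      if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sig_m + C * cos_sigma * (-1 + 2 * cos_2sig_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sig_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sig_m ** 2)
            - B / 6 * cos_2sig_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sig_m ** 2)))
    return b * A * (sigma - delta_sigma)  # geodesic length in meters

# POA -> LPS (decimal degrees; see the DMS conversion further down)
meters = vincenty_distance(-29.9942, -51.1714, 48.4839, -122.9378)
print(meters / 1609.344)  # 1 statute mile = 1609.344 m; ~6,982 miles
```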

Haversine formula
  • 6994.151 miles
  • 11255.995 kilometers
  • 6077.751 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (the great-circle distance, i.e. the shortest path between two points along the surface).
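The haversine version is much shorter. A minimal sketch, assuming the commonly used mean Earth radius of 6,371 km:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# POA -> LPS with the same decimal coordinates as above
print(haversine_distance(-29.9942, -51.1714, 48.4839, -122.9378))
# ~11,256 km, in line with the haversine figure quoted above
```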

How long does it take to fly from Porto Alegre to Lopez?

The estimated flight time from Salgado Filho Porto Alegre International Airport to Lopez Island Airport is 13 hours and 43 minutes.
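The page does not state the speed model behind this estimate. As an illustration only, an average block speed of about 509 mph over the 6,982-mile Vincenty distance reproduces the quoted time; that speed is reverse-engineered here, not taken from the source.

```python
def flight_time(distance_miles, avg_mph=509):
    """Rough flight-time estimate. avg_mph is an assumed average block
    speed; ~509 mph reproduces the 13 h 43 min figure quoted above."""
    hours = distance_miles / avg_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

print(flight_time(6982))  # (13, 43)
```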

Flight carbon footprint between Salgado Filho Porto Alegre International Airport (POA) and Lopez Island Airport (LPS)

On average, flying from Porto Alegre to Lopez generates about 852 kg of CO2 per passenger, which is roughly 1,879 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
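The pound figure is a unit conversion of the kilogram estimate. A small sketch of the arithmetic, which also shows the per-passenger emission rate implied by the page's own numbers:

```python
LB_PER_KG = 2.20462262  # pounds per kilogram

co2_kg = 852
print(co2_kg * LB_PER_KG)  # ~1,878 lb; the page shows 1,879, presumably
                           # converting an unrounded kilogram figure first
print(co2_kg / 11236)      # ~0.076 kg CO2 per passenger-kilometer implied
```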

Map of flight path from Porto Alegre to Lopez

See the map of the shortest flight path between Salgado Filho Porto Alegre International Airport (POA) and Lopez Island Airport (LPS).

Airport information

Origin: Salgado Filho Porto Alegre International Airport
City: Porto Alegre
Country: Brazil
IATA Code: POA
ICAO Code: SBPA
Coordinates: 29°59′39″S, 51°10′17″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
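The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect signed decimal degrees. A minimal conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

poa = (dms_to_decimal(29, 59, 39, "S"), dms_to_decimal(51, 10, 17, "W"))
lps = (dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))
print(poa)  # approximately (-29.9942, -51.1714)
print(lps)  # approximately (48.4839, -122.9378)
```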