
How far is Porto Seguro from Los Angeles, CA?

The distance between Los Angeles (Los Angeles International Airport) and Porto Seguro (Porto Seguro Airport) is 6257 miles / 10070 kilometers / 5437 nautical miles.

Distance from Los Angeles to Porto Seguro

There are several ways to calculate the distance from Los Angeles to Porto Seguro. Here are two standard methods:

Vincenty's formula (applied above)
  • 6257.297 miles
  • 10070.143 kilometers
  • 5437.442 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
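
As a rough illustration, here is a self-contained Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The ellipsoid choice and convergence tolerance are assumptions; the calculator does not state which parameters it uses.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres via Vincenty's inverse formula (WGS-84 assumed)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        # cos_sq_alpha is zero only for equatorial lines
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)   # metres
```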

Haversine formula
  • 6260.555 miles
  • 10075.387 kilometers
  • 5440.274 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the earth's surface.
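
A haversine implementation is much shorter. The mean earth radius of 6371 km below is the conventional choice, though the site's exact radius is not stated:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    h = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))
```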

How long does it take to fly from Los Angeles to Porto Seguro?

The estimated flight time from Los Angeles International Airport to Porto Seguro Airport is 12 hours and 20 minutes.
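
The calculator does not document how it derives this estimate. One plausible reconstruction, sketched below, adds about 30 minutes for taxi, climb, and descent to cruise time at roughly 850 km/h; both parameters are assumptions, chosen here because they land close to the stated 12 hours 20 minutes.

```python
def estimated_flight_time(distance_km, cruise_kmh=850.0, overhead_h=0.5):
    """Rough block-time estimate: fixed overhead plus cruise time (assumed parameters)."""
    hours = overhead_h + distance_km / cruise_kmh
    return f"{int(hours)} h {round(hours % 1 * 60)} m"

print(estimated_flight_time(10070))   # "12 h 21 m"
```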

Flight carbon footprint between Los Angeles International Airport (LAX) and Porto Seguro Airport (BPS)

On average, flying from Los Angeles to Porto Seguro generates about 752 kg (1657 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
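
The emissions methodology is likewise unpublished. A minimal model, shown below, multiplies route distance by a flat per-passenger emission factor; the 0.0747 kg CO2 per passenger-kilometre used here is reverse-engineered to reproduce the 752 kg figure, not a documented value.

```python
# Assumed flat emission factor (kg CO2 per passenger-km); real calculators
# also account for aircraft type, load factor, and extra takeoff fuel burn.
EMISSION_FACTOR = 0.0747

def co2_per_passenger_kg(distance_km, factor=EMISSION_FACTOR):
    return distance_km * factor

print(round(co2_per_passenger_kg(10070)))   # 752
```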

Map of flight path from Los Angeles to Porto Seguro

See the map of the shortest flight path between Los Angeles International Airport (LAX) and Porto Seguro Airport (BPS).

Airport information

Origin: Los Angeles International Airport
City: Los Angeles, CA
Country: United States
IATA Code: LAX
ICAO Code: KLAX
Coordinates: 33°56′33″N, 118°24′28″W

Destination: Porto Seguro Airport
City: Porto Seguro
Country: Brazil
IATA Code: BPS
ICAO Code: SBPS
Coordinates: 16°26′18″S, 39°4′51″W
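
To tie the numbers together, this sketch converts the DMS coordinates above to decimal degrees and runs them through the two distance functions defined earlier. dms_to_decimal is a hypothetical helper written for this article, not part of any published API.

```python
import re

def dms_to_decimal(dms):
    """Convert a coordinate like '33°56′33″N' to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

lax = (dms_to_decimal("33°56′33″N"), dms_to_decimal("118°24′28″W"))
bps = (dms_to_decimal("16°26′18″S"), dms_to_decimal("39°4′51″W"))

print(vincenty_distance(*lax, *bps) / 1000)   # ≈ 10070 km (ellipsoidal)
print(haversine_distance(*lax, *bps))         # ≈ 10075 km (spherical)
```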