How far is Pontes e Lacerda from Ushuaia?

The distance between Ushuaia (Ushuaia – Malvinas Argentinas International Airport) and Pontes e Lacerda (Pontes e Lacerda Airport) is 2775 miles / 4466 kilometers / 2412 nautical miles.

The driving distance from Ushuaia (USH) to Pontes e Lacerda (LCB) is 3508 miles / 5645 kilometers, and travel time by car is about 76 hours 15 minutes.

Distance from Ushuaia to Pontes e Lacerda

There are several ways to calculate the distance from Ushuaia to Pontes e Lacerda. Here are two standard methods:

Vincenty's formula (applied above)
  • 2775.277 miles
  • 4466.375 kilometers
  • 2411.649 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
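
For readers who want to reproduce the figure, here is a minimal pure-Python sketch of Vincenty's inverse formula. The WGS-84 ellipsoid parameters are an assumption (the page does not state which ellipsoid it uses), and the airport coordinates are decimal-degree conversions of the DMS values in the airport table below:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters (assumed; the page does not state its ellipsoid)
    a = 6378137.0             # semi-major axis in meters
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis in meters

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate the auxiliary longitude until it converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    # Vincenty's closing series for the ellipsoidal arc length
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma) / 1609.344  # meters -> statute miles

# USH -> LCB, decimal degrees converted from the DMS coordinates in the table below
print(round(vincenty_miles(-54.8431, -68.2956, -15.1933, -59.3847), 3))
# should land very close to the 2775.277-mile figure quoted above
```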

Haversine formula
  • 2780.898 miles
  • 4475.422 kilometers
  • 2416.534 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
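
A corresponding sketch of the haversine computation; the 3958.8-statute-mile mean Earth radius is an assumed value, and the result shifts slightly with the radius chosen:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    R = 3958.8  # mean Earth radius in statute miles (assumed value)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))  # great-circle distance

print(round(haversine_miles(-54.8431, -68.2956, -15.1933, -59.3847), 3))
# should land near the 2780.898-mile figure above, depending on the radius used
```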

How long does it take to fly from Ushuaia to Pontes e Lacerda?

The estimated flight time from Ushuaia – Malvinas Argentinas International Airport to Pontes e Lacerda Airport is 5 hours and 45 minutes.
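
The page does not state how this estimate is derived; a common rule of thumb is cruise time plus a fixed taxi/climb allowance. A rough sketch with assumed parameters:

```python
def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # 500 mph cruise and 30 min overhead are illustrative assumptions,
    # not the site's published model
    total_min = overhead_min + distance_miles / cruise_mph * 60
    return divmod(round(total_min), 60)

hours, minutes = flight_time(2775.277)
print(f"{hours} h {minutes} min")  # about 6 h 3 min, in the ballpark of the quoted 5 h 45 min
```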

Flight carbon footprint between Ushuaia – Malvinas Argentinas International Airport (USH) and Pontes e Lacerda Airport (LCB)

On average, flying from Ushuaia to Pontes e Lacerda generates about 308 kg of CO2 per passenger, roughly 678 pounds (lb). These figures are estimates and include only the CO2 generated by burning jet fuel.
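
The pound figure is just a unit conversion of the kilogram estimate. Converting the rounded 308 kg gives about 679 lb, so the 678 lb above presumably comes from an unrounded value:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 308  # per-passenger estimate quoted above
print(f"{co2_kg} kg ~ {co2_kg * KG_TO_LB:.0f} lb")  # 308 kg ~ 679 lb
```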

Map of flight path and driving directions from Ushuaia to Pontes e Lacerda

See the map of the shortest flight path between Ushuaia – Malvinas Argentinas International Airport (USH) and Pontes e Lacerda Airport (LCB).

Airport information

Origin Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
Destination Pontes e Lacerda Airport
City: Pontes e Lacerda
Country: Brazil
IATA Code: LCB
ICAO Code: SWBG
Coordinates: 15°11′36″S, 59°23′5″W
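
The distance formulas above take decimal degrees, while the coordinates here are listed as degrees/minutes/seconds; a small conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # south and west hemispheres are negative in the decimal convention
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(round(dms_to_decimal(54, 50, 35, "S"), 4))  # -54.8431 (USH latitude)
print(round(dms_to_decimal(68, 17, 44, "W"), 4))  # -68.2956 (USH longitude)
```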