
How far is San Juan from Pontes e Lacerda?

The distance between Pontes e Lacerda (Pontes e Lacerda Airport) and San Juan (Domingo Faustino Sarmiento Airport) is 1263 miles / 2033 kilometers / 1098 nautical miles.

The driving distance from Pontes e Lacerda (LCB) to San Juan (UAQ) is 1691 miles / 2722 kilometers, and travel time by car is about 41 hours 12 minutes.


Distance from Pontes e Lacerda to San Juan

There are several ways to calculate the distance from Pontes e Lacerda to San Juan. Here are two standard methods:

Vincenty's formula (applied above)
  • 1263.371 miles
  • 2033.199 kilometers
  • 1097.840 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
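As an illustration, here is a minimal Python sketch of the standard iterative Vincenty inverse formula on the WGS-84 ellipsoid. This is a textbook implementation, not necessarily the exact code used for the figures above, and it omits the edge cases (coincident and near-antipodal points) a production version would need:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula: geodesic distance on the WGS-84 ellipsoid, in km."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = a * (1 - f)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # iterate on the difference in longitude on the auxiliary sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# LCB to UAQ, using decimal degrees converted from the DMS coordinates listed below
print(f"{vincenty_km(-15.193333, -59.384722, -31.571389, -68.418056):.1f} km")
```

Run against the airport coordinates below, this reproduces the ellipsoidal figure above to within rounding of the published DMS values.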

Haversine formula
  • 1266.948 miles
  • 2038.955 kilometers
  • 1100.947 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (the great-circle distance, i.e. the shortest path between the two points along the sphere's surface).
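The haversine calculation fits in a few lines of Python. The sketch below uses the airport coordinates listed at the bottom of the page (converted to decimal degrees) and the common 6,371 km mean Earth radius:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# LCB (15°11′36″S, 59°23′5″W) to UAQ (31°34′17″S, 68°25′5″W)
km = haversine_km(-15.193333, -59.384722, -31.571389, -68.418056)
print(f"{km:.1f} km")  # ≈ 2039 km, matching the haversine figure above
```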

How long does it take to fly from Pontes e Lacerda to San Juan?

The estimated flight time from Pontes e Lacerda Airport to Domingo Faustino Sarmiento Airport is 2 hours and 53 minutes.
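Estimators of this kind typically combine a fixed allowance for takeoff, climb and descent with cruise time at a typical airliner speed. The exact parameters behind the figure above aren't published; the sketch below uses an assumed 30-minute overhead and 500 mph cruise, which lands in the same ballpark but not on the exact number:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airliner flight time: fixed overhead plus time at cruise speed.
    Both parameters are assumptions, not this page's published method."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimate_flight_minutes(1263.371)
print(f"~{int(minutes // 60)} h {int(minutes % 60)} min")  # → ~3 h 1 min
```

With these assumed parameters the estimate comes out slightly above the page's 2 hours 53 minutes, which suggests the site uses a somewhat faster cruise speed or smaller overhead.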

Flight carbon footprint between Pontes e Lacerda Airport (LCB) and Domingo Faustino Sarmiento Airport (UAQ)

On average, flying from Pontes e Lacerda to San Juan generates about 164 kg of CO2 per passenger, which is roughly 362 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
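The per-passenger figure implies an emission factor of roughly 0.08 kg of CO2 per kilometer flown (164 kg over 2,033 km). The sketch below back-calculates that factor purely for illustration — it is not the site's published methodology — and converts kilograms to pounds:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def co2_estimate_kg(distance_km, kg_per_km=164 / 2033.199):
    """Per-passenger CO2 estimate. The default factor is back-calculated
    from this page's figures, not an official emissions methodology."""
    return distance_km * kg_per_km

kg = co2_estimate_kg(2033.199)
print(f"{kg:.0f} kg = {kg / KG_PER_LB:.0f} lb")  # → 164 kg = 362 lb
```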

Map of flight path and driving directions from Pontes e Lacerda to San Juan

See the map of the shortest flight path between Pontes e Lacerda Airport (LCB) and Domingo Faustino Sarmiento Airport (UAQ).

Airport information

Origin Pontes e Lacerda Airport
City: Pontes e Lacerda
Country: Brazil
IATA Code: LCB
ICAO Code: SWBG
Coordinates: 15°11′36″S, 59°23′5″W
Destination Domingo Faustino Sarmiento Airport
City: San Juan
Country: Argentina
IATA Code: UAQ
ICAO Code: SANU
Coordinates: 31°34′17″S, 68°25′5″W
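The DMS coordinates above convert to the decimal degrees used by the distance formulas as degrees + minutes/60 + seconds/3600, negated for the Southern and Western hemispheres. A small sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in "SW" else value

# LCB: 15°11′36″S, 59°23′5″W
print(dms_to_decimal(15, 11, 36, "S"), dms_to_decimal(59, 23, 5, "W"))
# → -15.1933... -59.3847...
```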