How far is Pontes e Lacerda from San Jose?

The distance between San Jose (Juan Santamaría International Airport) and Pontes e Lacerda (Pontes e Lacerda Airport) is 2427 miles / 3906 kilometers / 2109 nautical miles.

Distance from San Jose to Pontes e Lacerda

There are several ways to calculate the distance from San Jose to Pontes e Lacerda. Here are two standard methods:

Vincenty's formula (applied above)
  • 2427.039 miles
  • 3905.941 kilometers
  • 2109.039 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
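
If you want to reproduce the ellipsoidal figure without hand-coding Vincenty's iterative method, a library such as pyproj (not mentioned on this page; an assumption for illustration) can compute the geodesic distance on the WGS-84 ellipsoid. It uses Karney's algorithm rather than Vincenty's, but the two agree to well under a metre at this range. A minimal sketch in Python, using the airport coordinates listed below converted to decimal degrees:

    from pyproj import Geod

    # Coordinates converted from the DMS values in the airport information section
    sjo_lat, sjo_lon = 9.99361, -84.20861    # SJO: 9°59′37″N, 84°12′31″W
    lcb_lat, lcb_lon = -15.19333, -59.38472  # LCB: 15°11′36″S, 59°23′5″W

    geod = Geod(ellps="WGS84")  # ellipsoidal Earth model
    _, _, meters = geod.inv(sjo_lon, sjo_lat, lcb_lon, lcb_lat)

    print(f"{meters / 1000:.1f} km")      # about 3906 km, matching the figure above
    print(f"{meters / 1609.344:.1f} mi")  # about 2427 miles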

Haversine formula
  • 2432.355 miles
  • 3914.496 kilometers
  • 2113.659 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth (great-circle distance, the shortest path between two points along the surface of the sphere).
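
For comparison, here is a minimal Python implementation of the haversine formula, using the same coordinates and a mean Earth radius of 6371 km; the small differences from the figures above come from rounding in the coordinates and the choice of radius:

    from math import radians, sin, cos, asin, sqrt

    def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two points on a spherical Earth."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * radius_km * asin(sqrt(a))

    km = haversine(9.99361, -84.20861, -15.19333, -59.38472)  # SJO -> LCB
    print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} nmi")
    # about 3914 km / 2432 mi / 2113 nmi, in line with the figures above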

How long does it take to fly from San Jose to Pontes e Lacerda?

The estimated flight time from Juan Santamaría International Airport to Pontes e Lacerda Airport is 5 hours and 5 minutes.
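
The page does not say how the flight time is estimated. As a sanity check using only the numbers quoted above, the implied average block speed (gate to gate, including climb and descent) is easy to compute:

    distance_mi = 2427   # great-circle distance from above
    time_h = 5 + 5 / 60  # 5 hours 5 minutes

    print(f"{distance_mi / time_h:.0f} mph")  # ~477 mph, a plausible jet block speed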

Flight carbon footprint between Juan Santamaría International Airport (SJO) and Pontes e Lacerda Airport (LCB)

On average, flying from San Jose to Pontes e Lacerda generates about 267 kg (588 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
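
A quick check on the unit conversion and the implied emission intensity, using only the figures quoted above (the 267 kg estimate itself is the site's, not derived here):

    co2_kg = 267
    print(f"{co2_kg * 2.20462:.0f} lb")  # ~589 lb; the page's 588 suggests an unrounded kg value

    distance_km = 3906
    print(f"{co2_kg * 1000 / distance_km:.0f} g CO2 per passenger-km")  # ~68 g/km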

Map of flight path from San Jose to Pontes e Lacerda

See the map of the shortest flight path between Juan Santamaría International Airport (SJO) and Pontes e Lacerda Airport (LCB).

Airport information

Origin: Juan Santamaría International Airport
City: San Jose
Country: Costa Rica
IATA Code: SJO
ICAO Code: MROC
Coordinates: 9°59′37″N, 84°12′31″W
Destination: Pontes e Lacerda Airport
City: Pontes e Lacerda
Country: Brazil
IATA Code: LCB
ICAO Code: SWBG
Coordinates: 15°11′36″S, 59°23′5″W
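
The coordinates above are given in degrees, minutes, and seconds; the decimal degrees used in the code sketches earlier come from a conversion like this (the helper name is arbitrary):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(9, 59, 37, "N"), dms_to_decimal(84, 12, 31, "W"))  # SJO:  9.99361..., -84.20861...
    print(dms_to_decimal(15, 11, 36, "S"), dms_to_decimal(59, 23, 5, "W"))  # LCB: -15.19333..., -59.38472...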