
How far is Latrobe, PA, from San Juan?

The distance between San Juan (Fernando Luis Ribas Dominicci Airport) and Latrobe (Arnold Palmer Regional Airport) is 1699 miles / 2735 kilometers / 1477 nautical miles.

Fernando Luis Ribas Dominicci Airport – Arnold Palmer Regional Airport: 1699 miles / 2735 kilometers / 1477 nautical miles


Distance from San Juan to Latrobe

There are several ways to calculate the distance from San Juan to Latrobe. Here are two standard methods:

Vincenty's formula (applied above)
  • 1699.143 miles
  • 2734.505 kilometers
  • 1476.514 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
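If you want to reproduce this figure yourself, the sketch below uses the geopy library's geodesic distance, which solves the same ellipsoidal problem as Vincenty's formula on the WGS-84 ellipsoid and agrees with it closely. The library choice and the decimal coordinates are our additions, converted from the DMS values listed under "Airport information" below.

```python
# Ellipsoidal (WGS-84) distance, comparable to Vincenty's formula.
# Requires: pip install geopy
from geopy.distance import geodesic

# Decimal-degree coordinates converted from the DMS values
# listed under "Airport information" below.
sig = (18.45667, -66.09806)   # Fernando Luis Ribas Dominicci Airport (SIG)
lbe = (40.27583, -79.40472)   # Arnold Palmer Regional Airport (LBE)

d = geodesic(sig, lbe)        # ellipsoidal geodesic on WGS-84
print(f"{d.miles:.3f} miles, {d.km:.3f} km, {d.nautical:.3f} NM")
# Expected output close to the Vincenty figures above (~1699 mi).
```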

Haversine formula
  • 1702.587 miles
  • 2740.048 kilometers
  • 1479.507 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
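The haversine formula is short enough to implement directly. The sketch below assumes the commonly used 6371 km mean Earth radius; the site's exact radius may differ slightly, so the last decimals won't match.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(18.45667, -66.09806, 40.27583, -79.40472)
print(f"{km:.3f} km = {km / 1.609344:.3f} miles = {km / 1.852:.3f} NM")
# ~2740 km, matching the haversine figures above.
```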

How long does it take to fly from San Juan to Latrobe?

The estimated flight time from Fernando Luis Ribas Dominicci Airport to Arnold Palmer Regional Airport is 3 hours and 43 minutes.
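The page does not state how this estimate is derived. A common rule of thumb is the great-circle distance at a typical jet cruise speed plus a fixed allowance for takeoff and landing; the sketch below uses assumed values (500 mph cruise, 30-minute allowance) and therefore only approximates the 3 h 43 min figure.

```python
distance_miles = 1699.143   # Vincenty distance from above
cruise_mph = 500            # assumed typical jet cruise speed
overhead_min = 30           # assumed takeoff/landing allowance

total_min = distance_miles / cruise_mph * 60 + overhead_min
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")
# ~3 h 54 min with these assumptions
```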

Flight carbon footprint between Fernando Luis Ribas Dominicci Airport (SIG) and Arnold Palmer Regional Airport (LBE)

On average, flying from San Juan to Latrobe generates about 193 kg (425 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
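The pound figure is just a unit conversion (1 kg ≈ 2.20462 lb):

```python
co2_kg = 193
print(f"{co2_kg * 2.20462:.0f} lb")  # 425 lb
```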

Map of flight path from San Juan to Latrobe

See the map of the shortest flight path between Fernando Luis Ribas Dominicci Airport (SIG) and Arnold Palmer Regional Airport (LBE).

Airport information

Origin: Fernando Luis Ribas Dominicci Airport
City: San Juan
Country: Puerto Rico
IATA Code: SIG
ICAO Code: TJIG
Coordinates: 18°27′24″N, 66°5′53″W
Destination: Arnold Palmer Regional Airport
City: Latrobe, PA
Country: United States
IATA Code: LBE
ICAO Code: KLBE
Coordinates: 40°16′33″N, 79°24′17″W
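The distance formulas above take decimal degrees, so the DMS coordinates listed here need converting first. A minimal helper (the function name is ours):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

sig_lat = dms_to_decimal(18, 27, 24, "N")
sig_lon = dms_to_decimal(66, 5, 53, "W")
lbe_lat = dms_to_decimal(40, 16, 33, "N")
lbe_lon = dms_to_decimal(79, 24, 17, "W")
print(f"SIG: {sig_lat:.5f}, {sig_lon:.5f}")  # SIG: 18.45667, -66.09806
print(f"LBE: {lbe_lat:.5f}, {lbe_lon:.5f}")  # LBE: 40.27583, -79.40472
```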