
How far is Lopez, WA, from San Salvador?

The distance between San Salvador (El Salvador International Airport) and Lopez (Lopez Island Airport) is 3094 miles / 4980 kilometers / 2689 nautical miles.

The driving distance from San Salvador (SAL) to Lopez (LPS) is 3834 miles / 6170 kilometers, and travel time by car is about 76 hours 50 minutes.

El Salvador International Airport – Lopez Island Airport

3094 miles / 4980 kilometers / 2689 nautical miles


Distance from San Salvador to Lopez

There are several ways to calculate the distance from San Salvador to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 3094.348 miles
  • 4979.870 kilometers
  • 2688.915 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
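As a rough illustration (not the site's own code), the snippet below computes the ellipsoidal distance between the two airports with the geopy library. Note that geopy's geodesic() uses Karney's algorithm rather than Vincenty's, but both work on the WGS-84 ellipsoid, so the result should agree with the figure above to within a small fraction of a mile. The decimal coordinates are converted from the DMS values listed in the airport information section below.

```python
# A sketch using geopy (pip install geopy). geodesic() works on the WGS-84
# ellipsoid (Karney's algorithm rather than Vincenty's), so the output should
# land very close to the Vincenty figures quoted above.
from geopy.distance import geodesic

# Decimal-degree versions of the coordinates listed in the airport information.
sal = (13.4408, -89.0556)    # El Salvador International Airport (SAL)
lps = (48.4839, -122.9378)   # Lopez Island Airport (LPS)

d = geodesic(sal, lps)
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} nmi")
```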

Haversine formula
  • 3097.212 miles
  • 4984.480 kilometers
  • 2691.404 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
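The great-circle figure can be reproduced with a few lines of standard-library Python. This is a minimal sketch assuming a mean earth radius of 6371.0088 km (the exact radius the site uses is not stated), so the output should come out very close to, but not necessarily exactly on, the numbers above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0088):
    """Great-circle distance between two lat/lon points on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(13.4408, -89.0556, 48.4839, -122.9378)  # SAL -> LPS
print(f"{km / 1.609344:.1f} mi / {km:.1f} km / {km / 1.852:.1f} nmi")
```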

How long does it take to fly from San Salvador to Lopez?

The estimated flight time from El Salvador International Airport to Lopez Island Airport is 6 hours and 21 minutes.
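The site does not publish its exact flight-time formula. A common back-of-the-envelope estimate, shown below purely as an illustration, assumes an average cruise speed of roughly 500 mph plus about half an hour for taxi, climb, and descent; the speed and overhead are assumptions, so the result only approximates the 6 hours 21 minutes quoted above.

```python
# Rough flight-time sketch. The 500 mph cruise speed and 30-minute overhead
# are assumptions, not the site's published method, so this only approximates
# the quoted 6 hours 21 minutes.
distance_miles = 3094
cruise_mph = 500          # assumed average cruise speed
overhead_hours = 0.5      # assumed taxi / climb / descent allowance

hours = distance_miles / cruise_mph + overhead_hours
h, m = int(hours), round((hours % 1) * 60)
print(f"Estimated flight time: about {h} h {m} min")
```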

Flight carbon footprint between El Salvador International Airport (SAL) and Lopez Island Airport (LPS)

On average, flying from San Salvador to Lopez generates about 346 kg of CO2 per passenger, which is equivalent to 762 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
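The quoted figures imply an emission factor of roughly 0.11 kg of CO2 per passenger-mile (346 kg over 3094 miles); the sketch below simply applies that implied factor and converts kilograms to pounds. The factor is back-calculated from the numbers above, not a published methodology.

```python
# Back-of-the-envelope CO2 sketch. The per-mile factor is implied by the
# figures quoted above (346 kg over 3094 miles), not a published methodology.
distance_miles = 3094
kg_co2_per_passenger_mile = 346 / 3094   # ~0.112 kg per mile, implied by the page

kg = distance_miles * kg_co2_per_passenger_mile
lbs = kg * 2.20462                        # kilograms to pounds
print(f"~{kg:.0f} kg CO2 per passenger (~{lbs:.0f} lbs)")
```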

Map of flight path and driving directions from San Salvador to Lopez

See the map of the shortest flight path between El Salvador International Airport (SAL) and Lopez Island Airport (LPS).

Airport information

Origin: El Salvador International Airport
City: San Salvador
Country: El Salvador
IATA Code: SAL
ICAO Code: MSLP
Coordinates: 13°26′27″N, 89°3′20″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
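For anyone who wants to feed the coordinates above into the distance formulas, here is a small sketch converting degrees-minutes-seconds to the decimal degrees used in the earlier examples (southern and western hemispheres are negative).

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Coordinates from the airport information above.
sal = (dms_to_decimal(13, 26, 27, "N"), dms_to_decimal(89, 3, 20, "W"))    # ~(13.4408, -89.0556)
lps = (dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))   # ~(48.4839, -122.9378)
print(sal, lps)
```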