
How far is Salta from Palmas?

The distance between Palmas (Palmas Airport) and Salta (Martín Miguel de Güemes International Airport) is 1507 miles / 2425 kilometers / 1309 nautical miles.

The driving distance from Palmas (PMW) to Salta (SLA) is 2277 miles / 3665 kilometers, and travel time by car is about 48 hours 37 minutes.

Palmas Airport – Martín Miguel de Güemes International Airport

1507 miles / 2425 kilometers / 1309 nautical miles


Distance from Palmas to Salta

There are several ways to calculate the distance from Palmas to Salta. Here are two standard methods:

Vincenty's formula (applied above)
  • 1506.695 miles
  • 2424.791 kilometers
  • 1309.282 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
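As a sketch of how such a figure can be reproduced, here is a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed below (converted from degrees/minutes/seconds to decimal degrees). The semi-major axis, flattening, and convergence tolerance follow the usual WGS-84 conventions; the site's exact parameters are not stated, so small differences in the last digits are possible.

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0            # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha != 0 else 0.0)  # equatorial line
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# PMW: 10°17′29″S, 48°21′25″W  ->  -10.291389, -48.356944
# SLA: 24°51′21″S, 65°29′10″W  ->  -24.855833, -65.486111
dist_km = vincenty_m(-10.291389, -48.356944, -24.855833, -65.486111) / 1000
```

Run on the coordinates above, this reproduces the 2424.791 km figure to within rounding.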

Haversine formula
  • 1508.593 miles
  • 2427.844 kilometers
  • 1310.931 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points over the earth's surface.

How long does it take to fly from Palmas to Salta?

The estimated flight time from Palmas Airport to Martín Miguel de Güemes International Airport is 3 hours and 21 minutes.
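A common way to estimate flight time is cruise time at an assumed average speed plus a fixed overhead for taxi, climb, and descent. The speed and overhead below are illustrative assumptions, not the site's actual parameters, so the result will differ slightly from the 3 h 21 min shown above.

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed overhead plus cruise at constant speed.
    cruise_mph and overhead_min are illustrative assumptions."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(1507)
hours, mins = divmod(round(minutes), 60)
```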

What is the time difference between Palmas and Salta?

There is no time difference between Palmas and Salta.

Flight carbon footprint between Palmas Airport (PMW) and Martín Miguel de Güemes International Airport (SLA)

On average, flying from Palmas to Salta generates about 180 kg of CO2 per passenger (equivalent to about 397 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
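The pound figure is just a unit conversion of the kilogram estimate, using the exact definition of the avoirdupois pound (1 lb = 0.45359237 kg):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

pounds = round(kg_to_lb(180))  # 180 kg of CO2 in pounds
```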

Map of flight path and driving directions from Palmas to Salta

See the map of the shortest flight path between Palmas Airport (PMW) and Martín Miguel de Güemes International Airport (SLA).

Airport information

Origin Palmas Airport
City: Palmas
Country: Brazil
IATA Code: PMW
ICAO Code: SBPJ
Coordinates: 10°17′29″S, 48°21′25″W
Destination Martín Miguel de Güemes International Airport
City: Salta
Country: Argentina
IATA Code: SLA
ICAO Code: SASA
Coordinates: 24°51′21″S, 65°29′10″W