
How far is Salta from Tumbes?

The distance between Tumbes (Tumbes Airport) and Salta (Martín Miguel de Güemes International Airport) is 1769 miles / 2847 kilometers / 1537 nautical miles.

The driving distance from Tumbes (TBP) to Salta (SLA) is 2377 miles / 3825 kilometers, and travel time by car is about 53 hours 18 minutes.


Distance from Tumbes to Salta

There are several ways to calculate the distance from Tumbes to Salta. Here are two standard methods:

Vincenty's formula (applied above)
  • 1769.024 miles
  • 2846.968 kilometers
  • 1537.240 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
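As a sketch of how such an ellipsoidal calculation works, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the decimal-degree equivalents of the airport coordinates listed below (south and west negative); the iteration count and convergence tolerance are common choices, not values taken from this site.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid (iterative)."""
    a = 6378137.0             # semi-major axis, meters
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):      # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (2 * cos_2sm ** 2 - 1)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000   # meters -> km

# TBP (Tumbes) to SLA (Salta), decimal degrees, S/W negative
print(round(vincenty_km(-3.5525, -80.38139, -24.85583, -65.48611), 1))
```

The result lands near the 2847 km figure quoted above; small differences come from rounding the coordinates to the precision shown on this page.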

Haversine formula
  • 1774.230 miles
  • 2855.346 kilometers
  • 1541.764 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
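The haversine calculation is much simpler, since it treats the Earth as a sphere. A minimal sketch, assuming the conventional mean Earth radius of 6371 km and the same decimal-degree coordinates as above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of mean Earth radius 6371 km."""
    R = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# TBP (Tumbes) to SLA (Salta)
print(round(haversine_km(-3.5525, -80.38139, -24.85583, -65.48611), 1))
```

This comes out near the 2855 km figure above; the roughly 8 km gap versus Vincenty reflects the spherical-versus-ellipsoidal Earth model.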

How long does it take to fly from Tumbes to Salta?

The estimated flight time from Tumbes Airport to Martín Miguel de Güemes International Airport is 3 hours and 50 minutes.
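The page does not state how it derives this estimate, but a common back-of-envelope method is distance divided by an average block speed. The 460 mph speed below is an assumption chosen for illustration, not the site's actual parameter:

```python
# Rough flight-time sketch: great-circle distance over an assumed
# average block speed (includes climb and descent) of 460 mph.
distance_miles = 1769
block_speed_mph = 460  # assumed value, not taken from the site

hours = distance_miles / block_speed_mph
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} h {m} min")
```

Under this assumption the result is close to the 3 hours 50 minutes quoted above.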

Flight carbon footprint between Tumbes Airport (TBP) and Martín Miguel de Güemes International Airport (SLA)

On average, flying from Tumbes to Salta generates about 198 kg (roughly 436 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
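The kilogram-to-pound conversion is straightforward; a per-mile figure can also be derived from the numbers on this page. A quick sketch:

```python
co2_kg = 198                  # estimated CO2 per passenger (from above)
distance_miles = 1769

co2_lbs = co2_kg * 2.20462    # kilograms -> pounds
per_mile = co2_kg / distance_miles   # kg of CO2 per flown mile
print(f"{co2_lbs:.1f} lbs total, {per_mile:.3f} kg per mile")
```

This gives about 436.5 lbs in total and a bit over 0.1 kg of CO2 per mile flown.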

Map of flight path and driving directions from Tumbes to Salta

See the map of the shortest flight path between Tumbes Airport (TBP) and Martín Miguel de Güemes International Airport (SLA).

Airport information

Origin: Tumbes Airport
City: Tumbes
Country: Perú
IATA Code: TBP
ICAO Code: SPME
Coordinates: 3°33′9″S, 80°22′53″W
Destination: Martín Miguel de Güemes International Airport
City: Salta
Country: Argentina
IATA Code: SLA
ICAO Code: SASA
Coordinates: 24°51′21″S, 65°29′10″W
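The coordinates above are given in degrees-minutes-seconds, while the distance formulas need signed decimal degrees. A small parser, assuming the exact `D°M′S″H` format used on this page (the function name is my own):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 3°33′9″S to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    # South and west hemispheres are negative by convention
    return -value if hemi in "SW" else value

print(dms_to_decimal("3°33′9″S"))    # Tumbes latitude -> -3.5525
print(dms_to_decimal("65°29′10″W"))  # Salta longitude
```

These decimal values are what the Vincenty and haversine sketches earlier on the page consume.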