
How far is Nandayure from Salt Lake City, UT?

The distance between Salt Lake City (Salt Lake City International Airport) and Nandayure (Punta Islita Airport) is 2680 miles / 4313 kilometers / 2329 nautical miles.

The driving distance from Salt Lake City (SLC) to Nandayure (PBP) is 3480 miles / 5601 kilometers, and travel time by car is about 70 hours 51 minutes.

Salt Lake City International Airport – Punta Islita Airport


Distance from Salt Lake City to Nandayure

There are several ways to calculate the distance from Salt Lake City to Nandayure. Here are two standard methods:

Vincenty's formula (applied above)
  • 2679.968 miles
  • 4312.991 kilometers
  • 2328.829 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
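A minimal sketch of the standard inverse Vincenty iteration on the WGS-84 ellipsoid, applied to the two airports' coordinates (converted to decimal degrees; the coordinate values, convergence tolerance, and iteration cap are assumptions, not values taken from this site's calculator):

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12):
    """Inverse Vincenty distance in statute miles on the WGS-84 ellipsoid."""
    a = 6378137.0                 # WGS-84 equatorial radius, meters
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # polar radius
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # meters -> miles

# SLC and PBP coordinates in decimal degrees (west longitude negative)
print(round(vincenty_miles(40.788333, -111.977778, 9.855833, -85.370556), 1))
# about 2680 miles
```

Note that this simple iteration can fail to converge for nearly antipodal points, which is not a concern for this route.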

Haversine formula
  • 2684.364 miles
  • 4320.065 kilometers
  • 2332.648 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
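The haversine calculation is short enough to sketch directly; the Earth radius below is an assumed mean value, so the result differs slightly depending on which radius a calculator uses:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles, assuming a spherical Earth."""
    R = 3958.8  # assumed mean Earth radius in miles
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * R * asin(sqrt(a))

# SLC and PBP coordinates in decimal degrees (west longitude negative)
print(round(haversine_miles(40.788333, -111.977778, 9.855833, -85.370556), 1))
# about 2684 miles
```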

How long does it take to fly from Salt Lake City to Nandayure?

The estimated flight time from Salt Lake City International Airport to Punta Islita Airport is 5 hours and 34 minutes.
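Flight-time estimates of this kind are typically cruise time plus a fixed taxi/climb allowance. The cruise speed and buffer below are assumed rule-of-thumb values, not the site's actual parameters, so the result will not match the 5 h 34 min figure exactly:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, buffer_min=30):
    """Rough estimate: cruise time at an assumed average ground speed,
    plus an assumed fixed buffer for taxi, climb, and descent."""
    total_min = distance_miles / cruise_mph * 60 + buffer_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(2680))  # 5 h 52 min with these assumptions
```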

Flight carbon footprint between Salt Lake City International Airport (SLC) and Punta Islita Airport (PBP)

On average, flying from Salt Lake City to Nandayure generates about 296 kg of CO2 per passenger, equivalent to roughly 653 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
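The kilograms-to-pounds conversion behind that figure uses the exact definition of the international pound:

```python
KG_PER_LB = 0.45359237  # exact definition of the international pound

def kg_to_lbs(kg):
    """Convert kilograms to pounds (avoirdupois)."""
    return kg / KG_PER_LB

print(round(kg_to_lbs(296)))  # 653
```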

Map of flight path and driving directions from Salt Lake City to Nandayure

See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Punta Islita Airport (PBP).

Airport information

Origin: Salt Lake City International Airport
City: Salt Lake City, UT
Country: United States
IATA Code: SLC
ICAO Code: KSLC
Coordinates: 40°47′18″N, 111°58′40″W
Destination: Punta Islita Airport
City: Nandayure
Country: Costa Rica
IATA Code: PBP
ICAO Code: MRIA
Coordinates: 9°51′21″N, 85°22′14″W
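The coordinates above are in degrees/minutes/seconds, while distance formulas need signed decimal degrees. A minimal conversion sketch (the helper name is my own, not from this site):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed
    decimal degrees; south and west are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SLC: 40°47′18″N, 111°58′40″W
print(dms_to_decimal(40, 47, 18, "N"))   # about 40.7883
print(dms_to_decimal(111, 58, 40, "W"))  # about -111.9778
```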