How far is Patras from Salt Lake City, UT?

The distance between Salt Lake City (Salt Lake City International Airport) and Patras (Patras Araxos Airport) is 6255 miles / 10067 kilometers / 5436 nautical miles.

Salt Lake City International Airport – Patras Araxos Airport

  • 6255 miles
  • 10067 kilometers
  • 5436 nautical miles

Distance from Salt Lake City to Patras

There are several ways to calculate the distance from Salt Lake City to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 6255.089 miles
  • 10066.591 kilometers
  • 5435.524 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
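As a rough illustration, the sketch below is a textbook Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed further down converted to decimal degrees. It is not the calculator's own code, so the last decimals may differ slightly from the figures above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in kilometers via Vincenty's inverse formula (WGS-84 ellipsoid)."""
    a = 6378137.0                 # semi-major axis (meters)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)

    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                       if cos_sq_alpha != 0 else 0.0)   # equatorial line
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos2sigma_m ** 2) -
            B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0       # meters -> kilometers

# SLC (40°47′18″N, 111°58′40″W) and GPA (38°9′3″N, 21°25′32″E) in decimal degrees
print(vincenty_distance(40.7883, -111.9778, 38.1508, 21.4256))  # expected close to 10066.6 km
```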

Haversine formula
  • 6240.445 miles
  • 10043.022 kilometers
  • 5422.798 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
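For comparison, here is a minimal haversine sketch in Python. It assumes a mean earth radius of 6371 km, which may differ slightly from the radius the calculator uses, so the result will only approximately match the figures above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# SLC to GPA in decimal degrees
km = haversine_distance(40.7883, -111.9778, 38.1508, 21.4256)
print(round(km), round(km * 0.621371), round(km * 0.539957))  # km, miles, nautical miles
```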

How long does it take to fly from Salt Lake City to Patras?

The estimated flight time from Salt Lake City International Airport to Patras Araxos Airport is 12 hours and 20 minutes.
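The quoted time is consistent with dividing the 6255-mile distance by an average block speed of roughly 507 mph. The helper below uses that inferred speed as its default; it is an assumption derived from the figures above, not a parameter published by the site.

```python
def flight_time_estimate(distance_miles, avg_speed_mph=507):
    """Convert a distance into an hours-and-minutes flight-time estimate.
    The 507 mph default is inferred from the quoted figures
    (6255 miles in 12 h 20 min); it is not a published value."""
    total_minutes = round(distance_miles / avg_speed_mph * 60)
    hours, minutes = divmod(total_minutes, 60)
    return f"{hours} hours and {minutes} minutes"

print(flight_time_estimate(6255))  # -> 12 hours and 20 minutes
```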

Flight carbon footprint between Salt Lake City International Airport (SLC) and Patras Araxos Airport (GPA)

On average, flying from Salt Lake City to Patras generates about 751 kg of CO2 per passenger, which is equivalent to about 1,657 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
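The quoted figure works out to roughly 0.075 kg of CO2 per passenger-kilometer (751 kg over 10067 km). The sketch below simply reuses that implied factor for illustration; it is not the site's published methodology.

```python
def co2_estimate_kg(distance_km, kg_per_passenger_km=0.0746):
    """Very rough per-passenger CO2 estimate. The emission factor is the
    quoted 751 kg divided by the 10067 km distance, i.e. an assumption
    back-calculated from the page, not a published methodology."""
    return distance_km * kg_per_passenger_km

kg = co2_estimate_kg(10067)
# ~751 kg and ~1656 lbs; the page's 1,657 lbs presumably comes from an unrounded kilogram figure
print(round(kg), round(kg * 2.20462))
```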

Map of flight path from Salt Lake City to Patras

See the map of the shortest flight path between Salt Lake City International Airport (SLC) and Patras Araxos Airport (GPA).

Airport information

Origin: Salt Lake City International Airport
City: Salt Lake City, UT
Country: United States
IATA Code: SLC
ICAO Code: KSLC
Coordinates: 40°47′18″N, 111°58′40″W

Destination: Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E