How far is Patras from Sabetta?
The distance between Sabetta (Sabetta International Airport) and Patras (Patras Araxos Airport) is 2894 miles / 4657 kilometers / 2514 nautical miles.
Sabetta International Airport – Patras Araxos Airport
Distance from Sabetta to Patras
There are several ways to calculate the distance from Sabetta to Patras. Here are two standard methods:
Vincenty's formula (applied above)
- 2893.519 miles
- 4656.667 kilometers
- 2514.399 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
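For reference, here is a minimal Python sketch of the standard inverse Vincenty iteration on the WGS-84 ellipsoid (it skips the coincident-point and equatorial edge cases of the full algorithm); the decimal coordinates are converted from the DMS values in the airport tables below.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: geodesic distance on the WGS-84 ellipsoid, in miles."""
    a, f = 6378137.0, 1 / 298.257223563      # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinL, cosL = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinL, cosU1 * sinU2 - sinU1 * cosU2 * cosL)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cosL
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinL / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma * (
            cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344  # metres -> statute miles

# SBT and GPA in decimal degrees; should land near the 2,893.5-mile figure above
print(round(vincenty_miles(71.2192, 72.0519, 38.1508, 21.4256), 1))
```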
Haversine formula
- 2888.232 miles
- 4648.158 kilometers
- 2509.805 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
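The same check for the spherical model, assuming the commonly used mean Earth radius of 6,371 km (the exact figure shifts slightly with the radius chosen):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    a = (math.sin(math.radians(lat2 - lat1) / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a)) / 1.609344

# Same coordinates as above; should come out near the 2,888-mile figure
print(round(haversine_miles(71.2192, 72.0519, 38.1508, 21.4256), 1))
```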
How long does it take to fly from Sabetta to Patras?
The estimated flight time from Sabetta International Airport to Patras Araxos Airport is 5 hours and 58 minutes.
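The page does not state the speed assumption behind this estimate; dividing the Vincenty distance by an average ground speed of roughly 485 mph (a hypothetical value backed out from the numbers here) reproduces it:

```python
distance_miles = 2893.5   # Vincenty distance from above
speed_mph = 485           # hypothetical average ground speed
h, m = divmod(round(distance_miles / speed_mph * 60), 60)
print(f"{h} h {m} min")   # -> 5 h 58 min
```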
What is the time difference between Sabetta and Patras?
The time difference between Sabetta and Patras is 3 hours: Patras is 3 hours behind Sabetta.
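This can be checked with Python's zoneinfo, assuming Sabetta observes Yekaterinburg Time (IANA zone Asia/Yekaterinburg, UTC+5) and Patras follows Europe/Athens (UTC+2 in winter). Note that Greece observes daylight saving time, so the gap narrows to 2 hours in summer:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Assumed IANA zones: Sabetta (Yamalo-Nenets) -> Asia/Yekaterinburg; Patras -> Europe/Athens
when = datetime(2024, 1, 15, 12, 0)  # a winter date, outside Greek DST
offset = (when.replace(tzinfo=ZoneInfo("Asia/Yekaterinburg")).utcoffset()
          - when.replace(tzinfo=ZoneInfo("Europe/Athens")).utcoffset())
print(offset)  # 3:00:00 -> Patras is 3 hours behind Sabetta
```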
Flight carbon footprint between Sabetta International Airport (SBT) and Patras Araxos Airport (GPA)
On average, flying from Sabetta to Patras generates about 322 kg (709 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
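The unit conversion is plain arithmetic (322 kg × 2.20462 ≈ 709.9 lbs, which the page rounds to 709). The per-mile factor below is simply backed out from the page's own numbers, not an independent emissions model:

```python
co2_kg = 322
print(round(co2_kg * 2.20462, 1))   # -> 709.9 lbs
print(round(co2_kg / 2893.5, 3))    # -> 0.111 kg CO2 per passenger-mile (implied)
```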
Map of flight path from Sabetta to Patras
See the map of the shortest flight path between Sabetta International Airport (SBT) and Patras Araxos Airport (GPA).
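To plot such a path yourself, you can interpolate waypoints along the great circle; here is a minimal sketch using the standard spherical interpolation formula and the coordinates from the tables below:

```python
import math

def gc_waypoints(lat1, lon1, lat2, lon2, n=10):
    """n+1 evenly spaced (lat, lon) points along the great circle between two airports."""
    phi1, lam1, phi2, lam2 = map(math.radians, (lat1, lon1, lat2, lon2))
    # Angular separation of the endpoints (haversine form)
    d = 2 * math.asin(math.sqrt(math.sin((phi2 - phi1) / 2) ** 2
        + math.cos(phi1) * math.cos(phi2) * math.sin((lam2 - lam1) / 2) ** 2))
    points = []
    for i in range(n + 1):
        A = math.sin((1 - i / n) * d) / math.sin(d)
        B = math.sin(i / n * d) / math.sin(d)
        x = A * math.cos(phi1) * math.cos(lam1) + B * math.cos(phi2) * math.cos(lam2)
        y = A * math.cos(phi1) * math.sin(lam1) + B * math.cos(phi2) * math.sin(lam2)
        z = A * math.sin(phi1) + B * math.sin(phi2)
        points.append((math.degrees(math.atan2(z, math.hypot(x, y))),
                       math.degrees(math.atan2(y, x))))
    return points

for lat, lon in gc_waypoints(71.2192, 72.0519, 38.1508, 21.4256, n=5):
    print(f"{lat:.2f}, {lon:.2f}")
```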
Airport information
| Origin | Sabetta International Airport |
| --- | --- |
| City | Sabetta |
| Country | Russia |
| IATA Code | SBT |
| ICAO Code | USDA |
| Coordinates | 71°13′9″N, 72°3′7″E |
| Destination | Patras Araxos Airport |
| --- | --- |
| City | Patras |
| Country | Greece |
| IATA Code | GPA |
| ICAO Code | LGRX |
| Coordinates | 38°9′3″N, 21°25′32″E |
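To feed the DMS coordinates above into any of the formulas earlier on the page, convert them to decimal degrees; a small helper (S and W hemispheres take a negative sign):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(round(dms_to_decimal(71, 13, 9, "N"), 4))   # 71.2192 (Sabetta latitude)
print(round(dms_to_decimal(21, 25, 32, "E"), 4))  # 21.4256 (Patras longitude)
```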