
How far is Syros Island from Pittsburgh, PA?

The distance between Pittsburgh (Pittsburgh International Airport) and Syros Island (Syros Island National Airport) is 5286 miles / 8507 kilometers / 4594 nautical miles.

Pittsburgh International Airport – Syros Island National Airport


Distance from Pittsburgh to Syros Island

There are several ways to calculate the distance from Pittsburgh to Syros Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 5286.177 miles
  • 8507.277 kilometers
  • 4593.562 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
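For readers who want to reproduce the number, here is a minimal, self-contained Python sketch of Vincenty's inverse solution on the WGS-84 ellipsoid (an illustration, not the calculator's own code); the decimal coordinates are converted from the DMS values in the Airport information section below:

```python
import math

def vincenty_meters(lat1, lon1, lat2, lon2):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns distance in meters."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate until lambda converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM
                                    + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                             * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# PIT (40.491389, -80.232778) to JSY (37.422778, 24.950833)
m = vincenty_meters(40.491389, -80.232778, 37.422778, 24.950833)
print(f"{m / 1609.344:.3f} mi  {m / 1000:.3f} km  {m / 1852:.3f} nmi")
```

Run as-is, this should land very close to the Vincenty figures quoted above.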

Haversine formula
  • 5273.588 miles
  • 8487.018 kilometers
  • 4582.623 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
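The haversine calculation is much simpler. A compact Python sketch, assuming the commonly used mean Earth radius of 6371 km (an assumption; other radii give slightly different results):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of radius 6371 km, in kilometers."""
    R = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

km = haversine_km(40.491389, -80.232778, 37.422778, 24.950833)
print(f"{km / 1.609344:.3f} mi  {km:.3f} km  {km / 1.852:.3f} nmi")
```

The roughly 20 km gap between this result and the Vincenty one reflects the spherical approximation.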

How long does it take to fly from Pittsburgh to Syros Island?

The estimated flight time from Pittsburgh International Airport to Syros Island National Airport is 10 hours and 30 minutes.
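The site does not publish its timing method, but the estimate is consistent with a typical jet block speed, as this illustrative check shows:

```python
distance_mi = 5286    # Vincenty distance from above
block_hours = 10.5    # 10 hours 30 minutes
print(f"implied average speed: {distance_mi / block_hours:.0f} mph")  # ~503 mph
```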

Flight carbon footprint between Pittsburgh International Airport (PIT) and Syros Island National Airport (JSY)

On average, flying from Pittsburgh to Syros Island generates about 621 kg (roughly 1,369 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
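The figure is easy to sanity-check against the distance above; the arithmetic below is an illustrative check, not the site's estimation method:

```python
co2_kg = 621          # per passenger, from above
distance_mi = 5286
print(f"{co2_kg * 2.20462:.0f} lb total")                              # ~1369 lb
print(f"{co2_kg / distance_mi * 1000:.0f} g CO2 per passenger-mile")   # ~117 g
```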

Map of flight path from Pittsburgh to Syros Island

See the map of the shortest flight path between Pittsburgh International Airport (PIT) and Syros Island National Airport (JSY).

Airport information

Origin: Pittsburgh International Airport
City: Pittsburgh, PA
Country: United States
IATA Code: PIT
ICAO Code: KPIT
Coordinates: 40°29′29″N, 80°13′58″W

Destination: Syros Island National Airport
City: Syros Island
Country: Greece
IATA Code: JSY
ICAO Code: LGSO
Coordinates: 37°25′22″N, 24°57′3″E
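The code sketches above use decimal degrees; a small helper (hypothetical, for illustration) converts the DMS coordinates listed here:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# PIT: 40°29′29″N, 80°13′58″W  ->  40.491389, -80.232778
print(round(dms_to_decimal(40, 29, 29, "N"), 6),
      round(dms_to_decimal(80, 13, 58, "W"), 6))
# JSY: 37°25′22″N, 24°57′3″E   ->  37.422778, 24.950833
print(round(dms_to_decimal(37, 25, 22, "N"), 6),
      round(dms_to_decimal(24, 57, 3, "E"), 6))
```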