How far is Split from Skiathos?
The distance between Skiathos (Skiathos International Airport) and Split (Split Airport) is 480 miles / 773 kilometers / 417 nautical miles.
The driving distance from Skiathos (JSI) to Split (SPU) is 783 miles / 1260 kilometers, and travel time by car is about 26 hours 56 minutes.
Skiathos International Airport – Split Airport
Distance from Skiathos to Split
There are several ways to calculate the distance from Skiathos to Split. Here are two standard methods:
Vincenty's formula (applied above)
- 480.282 miles
- 772.939 kilometers
- 417.354 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
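For readers who want to reproduce the figure, below is a minimal Python sketch of the standard published form of Vincenty's inverse formula on the WGS-84 ellipsoid. It is an illustrative implementation of the general method, not this site's own code, so the last decimal place may differ slightly.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal (WGS-84) distance in metres via Vincenty's inverse formula."""
    a = 6378137.0           # semi-major axis, metres
    f = 1 / 298.257223563   # flattening
    b = (1 - f) * a         # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        if cos2_alpha != 0:
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        else:
            cos_2sigma_m = 0.0                # both points on the equator
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                                     + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)      # metres

# JSI -> SPU, coordinates converted from the airport table below
metres = vincenty_distance(39.1769, 23.5036, 43.5389, 16.2978)
print(f"{metres / 1609.344:.3f} mi / {metres / 1000:.3f} km / {metres / 1852:.3f} nmi")
```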
Haversine formula
- 479.759 miles
- 772.098 kilometers
- 416.899 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
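The same trip with the haversine formula, as a minimal Python sketch. A mean Earth radius of 6,371 km is assumed here, so the result matches the figures above only to within a small rounding margin.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    r = 6371.0  # assumed mean Earth radius in kilometres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)

    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# JSI -> SPU, coordinates converted from the airport table below
km = haversine_distance(39.1769, 23.5036, 43.5389, 16.2978)
print(f"{km * 0.621371:.3f} mi / {km:.3f} km / {km * 0.539957:.3f} nmi")
```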
How long does it take to fly from Skiathos to Split?
The estimated flight time from Skiathos International Airport to Split Airport is 1 hour and 24 minutes.
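This estimate is broadly consistent with a common rule of thumb, used here purely as an illustration and not necessarily this site's exact method: fly the great-circle distance at roughly 500 mph and add about half an hour for taxi, climb and descent.

```python
# Rough rule-of-thumb sketch (assumed values, not the site's exact method)
distance_miles = 480.282   # Vincenty distance from above
cruise_mph = 500           # assumed average cruise speed
overhead_minutes = 30      # assumed allowance for taxi, climb and descent

total_minutes = distance_miles / cruise_mph * 60 + overhead_minutes
hours, minutes = divmod(round(total_minutes), 60)
print(f"Estimated flight time: {hours} h {minutes} min")  # about 1 h 28 min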
What is the time difference between Skiathos and Split?
The time difference between Skiathos and Split is 1 hour. Split is 1 hour behind Skiathos.
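The offset can be checked with Python's zoneinfo module, assuming the standard IANA zones Europe/Athens for Skiathos and Europe/Zagreb for Split; both follow the EU daylight-saving schedule, so the gap stays at one hour year-round.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Any fixed date works; both zones switch to and from DST on the same EU dates.
when = datetime(2024, 1, 15, 12, 0)
skiathos_offset = when.replace(tzinfo=ZoneInfo("Europe/Athens")).utcoffset()
split_offset = when.replace(tzinfo=ZoneInfo("Europe/Zagreb")).utcoffset()

print(skiathos_offset - split_offset)  # 1:00:00 -> Split is 1 hour behind Skiathos
```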
Flight carbon footprint between Skiathos International Airport (JSI) and Split Airport (SPU)
On average, flying from Skiathos to Split generates about 96 kg of CO2 per passenger; 96 kilograms is roughly 211 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
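As an illustrative sketch only: dividing the quoted per-passenger figure by the flight distance implies a factor of roughly 0.2 kg of CO2 per passenger-mile, and the kilograms-to-pounds conversion is a straightforward multiplication. The implied factor is an inference from the numbers above, not this site's actual methodology.

```python
co2_kg = 96                # quoted per-passenger estimate
distance_miles = 480.282   # Vincenty distance from above

print(f"Implied factor: {co2_kg / distance_miles:.2f} kg CO2 per passenger-mile")  # ~0.20
print(f"{co2_kg} kg is about {co2_kg * 2.20462:.0f} lb")  # ~212 lb, close to the quoted 211
```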
Map of flight path and driving directions from Skiathos to Split
See the map of the shortest flight path between Skiathos International Airport (JSI) and Split Airport (SPU).
Airport information
| Origin | Skiathos International Airport |
| --- | --- |
| City | Skiathos |
| Country | Greece |
| IATA Code | JSI |
| ICAO Code | LGSK |
| Coordinates | 39°10′37″N, 23°30′13″E |
| Destination | Split Airport |
| --- | --- |
| City | Split |
| Country | Croatia |
| IATA Code | SPU |
| ICAO Code | LDSP |
| Coordinates | 43°32′20″N, 16°17′52″E |