
How far is Sitia from Cape Town?

The distance between Cape Town (Cape Town International Airport) and Sitia (Sitia Public Airport) is 4783 miles / 7698 kilometers / 4157 nautical miles.

Cape Town International Airport – Sitia Public Airport

4783 miles
7698 kilometers
4157 nautical miles

Distance from Cape Town to Sitia

There are several ways to calculate the distance from Cape Town to Sitia. Here are two standard methods:

Vincenty's formula (applied above)
  • 4783.364 miles
  • 7698.079 kilometers
  • 4156.630 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
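
As an illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport coordinates listed at the bottom of this page; the constants, tolerance, and rounding are assumptions, not this calculator's exact implementation.

    import math

    # WGS-84 ellipsoid parameters (assumed; the calculator's exact datum is not stated)
    A_AXIS = 6378137.0             # semi-major axis in metres
    F = 1 / 298.257223563          # flattening
    B_AXIS = (1 - F) * A_AXIS      # semi-minor axis in metres

    def vincenty_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Geodesic distance in metres via Vincenty's inverse formula."""
        U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))   # reduced latitudes
        U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                                        # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * F * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break                                             # converged

        u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
        big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return B_AXIS * big_a * (sigma - d_sigma)

    # CPT (33°57′53″S, 18°36′6″E) to JSH (35°12′57″N, 26°6′4″E) in decimal degrees
    metres = vincenty_m(-33.9647, 18.6017, 35.2158, 26.1011)
    print(round(metres / 1000), "km")        # should land near the 7698 km quoted above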

Haversine formula
  • 4804.465 miles
  • 7732.037 kilometers
  • 4174.966 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
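
A corresponding Python sketch of the haversine computation, assuming a mean Earth radius of 6371 km (the radius this calculator uses is not stated):

    import math

    EARTH_RADIUS_KM = 6371.0        # assumed mean Earth radius for the spherical model

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometres between two latitude/longitude points."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

    # CPT to JSH with the same decimal coordinates as above
    print(round(haversine_km(-33.9647, 18.6017, 35.2158, 26.1011)), "km")   # close to the 7732 km above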

How long does it take to fly from Cape Town to Sitia?

The estimated flight time from Cape Town International Airport to Sitia Public Airport is 9 hours and 33 minutes.
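
The calculator does not publish its timing method; a common rule of thumb is simply distance divided by a typical airliner cruise speed of about 805 km/h (500 mph). A rough sketch under that assumption:

    # Rough block-time estimate; the 805 km/h cruise speed is an assumption,
    # not this site's documented method.
    distance_km = 7698.079                  # Vincenty distance from above
    cruise_kmh = 805
    hours = distance_km / cruise_kmh
    print(f"{int(hours)} h {round(hours % 1 * 60)} min")   # roughly 9 h 34 min

That comes out within a minute of the 9 hours and 33 minutes quoted above; real schedules also add time for taxi, climb, descent, and routing.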

What is the time difference between Cape Town and Sitia?

Cape Town (SAST, UTC+2) and Sitia (EET, UTC+2) share the same standard time, so there is normally no time difference; when Greece observes summer time (EEST, UTC+3), Sitia is one hour ahead.

Flight carbon footprint between Cape Town International Airport (CPT) and Sitia Public Airport (JSH)

On average, flying from Cape Town to Sitia generates about 556 kg of CO2 per passenger (556 kilograms equals 1,225 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
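
As a quick check on the unit conversion (1 kg ≈ 2.20462 lb):

    co2_kg = 556
    print(round(co2_kg * 2.20462), "lb")    # ≈ 1226 lb; the 1,225 lb above reflects slightly different rounding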

Map of flight path from Cape Town to Sitia

See the map of the shortest flight path between Cape Town International Airport (CPT) and Sitia Public Airport (JSH).
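
The plotted route follows the great circle between the two airports. A minimal sketch that generates intermediate waypoints along that path by spherical linear interpolation (the point count and decimal coordinates are assumptions for illustration):

    import math

    def _to_xyz(lat, lon):
        """Convert latitude/longitude in degrees to a unit vector on the sphere."""
        lat, lon = math.radians(lat), math.radians(lon)
        return (math.cos(lat) * math.cos(lon), math.cos(lat) * math.sin(lon), math.sin(lat))

    def great_circle_waypoints(lat1, lon1, lat2, lon2, n=50):
        """Return n + 1 (lat, lon) points along the great circle, ready for plotting."""
        p, q = _to_xyz(lat1, lon1), _to_xyz(lat2, lon2)
        omega = math.acos(max(-1.0, min(1.0, sum(a * b for a, b in zip(p, q)))))
        if omega == 0:
            return [(lat1, lon1)] * (n + 1)            # coincident points
        points = []
        for i in range(n + 1):
            t = i / n
            s1 = math.sin((1 - t) * omega) / math.sin(omega)
            s2 = math.sin(t * omega) / math.sin(omega)
            x, y, z = (s1 * a + s2 * b for a, b in zip(p, q))
            points.append((math.degrees(math.asin(z)), math.degrees(math.atan2(y, x))))
        return points

    waypoints = great_circle_waypoints(-33.9647, 18.6017, 35.2158, 26.1011)   # CPT -> JSH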

Airport information

Origin: Cape Town International Airport
City: Cape Town
Country: South Africa
IATA Code: CPT
ICAO Code: FACT
Coordinates: 33°57′53″S, 18°36′6″E
Destination: Sitia Public Airport
City: Sitia
Country: Greece
IATA Code: JSH
ICAO Code: LGST
Coordinates: 35°12′57″N, 26°6′4″E