Air Miles Calculator

How far is Patras from Khartoum?

The distance between Khartoum (Khartoum International Airport) and Patras (Patras Araxos Airport) is 1695 miles / 2728 kilometers / 1473 nautical miles.

Khartoum International Airport – Patras Araxos Airport

1695 miles / 2728 kilometers / 1473 nautical miles


Distance from Khartoum to Patras

There are several ways to calculate the distance from Khartoum to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 1695.324 miles
  • 2728.359 kilometers
  • 1473.196 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
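Vincenty's inverse method can be sketched in a short, self-contained program. This follows the standard published iteration on the WGS-84 ellipsoid; it is not necessarily the exact code this site runs, and the decimal coordinates below are converted from the airport coordinates listed further down the page.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty's inverse formula)."""
    a = 6378137.0                # semi-major axis (m)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                        # iterate lambda to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# KRT and GPA coordinates in decimal degrees
meters = vincenty_inverse(15.589444, 32.553056, 38.150833, 21.425556)
print(meters / 1000)   # ≈ 2728.4 km, in line with the figure above
```

Because Vincenty iterates on an ellipsoid rather than assuming a sphere, it typically agrees with high-precision geodesic libraries to within a millimeter for non-antipodal points.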

Haversine formula
  • 1699.782 miles
  • 2735.534 kilometers
  • 1477.070 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
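The haversine formula is much shorter, since the spherical model needs no iteration. A minimal sketch, using a mean Earth radius of 6371 km (the site may use a slightly different radius, which accounts for small differences in the last decimal places):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km, assuming a sphere of mean radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# KRT and GPA coordinates in decimal degrees
print(haversine_km(15.589444, 32.553056, 38.150833, 21.425556))  # ≈ 2735.5
```

The spherical result comes out a few kilometers longer than Vincenty's ellipsoidal figure here, which is typical for routes with a large north–south component.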

How long does it take to fly from Khartoum to Patras?

The estimated flight time from Khartoum International Airport to Patras Araxos Airport is 3 hours and 42 minutes.
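Flight-time estimates of this kind are usually derived from the distance with an assumed average speed plus a fixed allowance for taxi, climb, and descent. The model below is a hypothetical sketch with assumed constants, not the site's actual formula; its exact speed and overhead evidently differ, since it arrives at 3 hours 42 minutes.

```python
CRUISE_MPH = 500.0   # assumed average ground speed (hypothetical)
OVERHEAD_H = 0.5     # assumed taxi/climb/descent allowance in hours (hypothetical)

def flight_time_hours(distance_miles):
    """Rough flight time: fixed overhead plus cruise time at an assumed speed."""
    return OVERHEAD_H + distance_miles / CRUISE_MPH

t = flight_time_hours(1695)
print(f"{int(t)} h {round((t % 1) * 60)} min")  # 3 h 53 min with these assumptions
```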

What is the time difference between Khartoum and Patras?

There is no time difference between Khartoum and Patras.

Flight carbon footprint between Khartoum International Airport (KRT) and Patras Araxos Airport (GPA)

On average, flying from Khartoum to Patras generates about 192 kg of CO2 per passenger, roughly 424 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
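The kilogram-to-pound conversion is a single multiplication by the standard factor 2.20462. Note that exactly 192 kg converts to about 423.3 lb; the pound figure quoted above is presumably converted from the unrounded per-passenger estimate:

```python
KG_TO_LB = 2.20462  # standard kilograms-to-pounds conversion factor

print(round(192 * KG_TO_LB, 1))  # 423.3 lb for exactly 192 kg
```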

Map of flight path from Khartoum to Patras

See the map of the shortest flight path between Khartoum International Airport (KRT) and Patras Araxos Airport (GPA).

Airport information

Origin Khartoum International Airport
City: Khartoum
Country: Sudan
IATA Code: KRT
ICAO Code: HSSS
Coordinates: 15°35′22″N, 32°33′11″E
Destination Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E
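The coordinates above are given in degrees, minutes, and seconds; the distance formulas earlier need them in signed decimal degrees. A small conversion helper (the hemisphere letter supplies the sign):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(15, 35, 22, "N"))  # ≈ 15.5894  (KRT latitude)
print(dms_to_decimal(21, 25, 32, "E"))  # ≈ 21.4256  (GPA longitude)
```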