
How far is Patras from Newcastle?

The distance between Newcastle (Newcastle Airport) and Patras (Patras Araxos Airport) is 1589 miles / 2557 kilometers / 1381 nautical miles.

The driving distance from Newcastle (NCL) to Patras (GPA) is 2231 miles / 3591 kilometers, and travel time by car is about 38 hours 23 minutes.

Newcastle Airport – Patras Araxos Airport

  • 1589 miles
  • 2557 kilometers
  • 1381 nautical miles


Distance from Newcastle to Patras

There are several ways to calculate the distance from Newcastle to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 1589.027 miles
  • 2557.291 kilometers
  • 1380.827 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.

Haversine formula
  • 1587.186 miles
  • 2554.329 kilometers
  • 1379.227 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
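
For illustration, here is a minimal Python sketch of the haversine calculation using the airport coordinates listed in the airport information section below; the Earth radius of 3,958.8 miles and the unit-conversion factors are assumptions, so small differences from the figures above are expected.

```python
import math

# Airport coordinates in decimal degrees (from the airport information section).
NCL = (55.0372, -1.6917)   # Newcastle Airport, 55°2′14″N 1°41′30″W
GPA = (38.1508, 21.4256)   # Patras Araxos Airport, 38°9′3″N 21°25′32″E

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two points on a sphere, in statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_miles * math.asin(math.sqrt(a))

miles = haversine_miles(*NCL, *GPA)
print(f"{miles:.1f} mi / {miles * 1.609344:.1f} km / {miles * 0.868976:.1f} nmi")
# prints roughly: 1587.2 mi / 2554.3 km / 1379.2 nmi
```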

How long does it take to fly from Newcastle to Patras?

The estimated flight time from Newcastle Airport to Patras Araxos Airport is 3 hours and 30 minutes.

Flight carbon footprint between Newcastle Airport (NCL) and Patras Araxos Airport (GPA)

On average, flying from Newcastle to Patras generates about 185 kg of CO2 per passenger, which is equal to 408 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
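
As a quick check of the unit conversion, here is a one-line Python sketch; the factor 2.20462 lb/kg is the standard kilograms-to-pounds conversion.

```python
co2_kg = 185                              # estimated CO2 per passenger on this route
co2_lb = co2_kg * 2.20462                 # standard kilograms-to-pounds factor
print(f"{co2_kg} kg ≈ {co2_lb:.0f} lb")   # prints: 185 kg ≈ 408 lb
```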

Map of flight path and driving directions from Newcastle to Patras

See the map of the shortest flight path between Newcastle Airport (NCL) and Patras Araxos Airport (GPA).

Airport information

Origin: Newcastle Airport
City: Newcastle
Country: United Kingdom
IATA Code: NCL
ICAO Code: EGNT
Coordinates: 55°2′14″N, 1°41′30″W
Destination: Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E
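
The coordinates above are given in degrees, minutes, and seconds; the short Python sketch below shows the standard conversion to the decimal degrees used in the haversine example earlier.

```python
def dms_to_decimal(degrees: int, minutes: int, seconds: float, hemisphere: str) -> float:
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Newcastle Airport (NCL): 55°2′14″N, 1°41′30″W
print(dms_to_decimal(55, 2, 14, "N"), dms_to_decimal(1, 41, 30, "W"))   # ≈ 55.0372 -1.6917

# Patras Araxos Airport (GPA): 38°9′3″N, 21°25′32″E
print(dms_to_decimal(38, 9, 3, "N"), dms_to_decimal(21, 25, 32, "E"))   # ≈ 38.1508 21.4256
```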