
How far is Pisa from Land's End?

The distance between Land's End (Land's End Airport) and Pisa (Pisa International Airport) is 878 miles / 1413 kilometers / 763 nautical miles.

The driving distance from Land's End (LEQ) to Pisa (PSA) is 1233 miles / 1985 kilometers, and travel time by car is about 21 hours 35 minutes.

Land's End Airport – Pisa International Airport

878 miles / 1413 kilometers / 763 nautical miles


Distance from Land's End to Pisa

There are several ways to calculate the distance from Land's End to Pisa. Here are two standard methods:

Vincenty's formula (applied above)
  • 877.766 miles
  • 1412.628 kilometers
  • 762.758 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
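For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. It is a generic textbook implementation, not necessarily the exact code used above; the airport coordinates are decimal-degree conversions of the DMS values listed at the bottom of the page, and the iteration tolerance is an arbitrary choice. It should land very close to the 1,412.6 km figure above.

```python
import math

def vincenty_miles_km(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0                 # semi-major axis (metres)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))    # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                        # iterate until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0, 0.0                     # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = ((cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha)
                   if cos2_alpha else 0.0)      # equatorial lines: cos2_alpha = 0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    metres = b * A * (sigma - d_sigma)
    return metres / 1609.344, metres / 1000.0   # (miles, kilometres)

# LEQ and PSA in decimal degrees (converted from the DMS coordinates below)
miles, km = vincenty_miles_km(50.1028, -5.6706, 43.6839, 10.3925)
print(f"{miles:.3f} mi / {km:.3f} km")          # ≈ 877.8 mi / 1412.6 km
```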

Haversine formula
  • 875.926 miles
  • 1409.667 kilometers
  • 761.159 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
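A minimal haversine sketch in the same spirit, assuming the conventional mean Earth radius of 6,371 km; it should come out close to the 1,409.7 km figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(50.1028, -5.6706, 43.6839, 10.3925)
print(f"{km / 1.609344:.3f} mi / {km:.3f} km")   # ≈ 875.9 mi / 1409.7 km
```

The two results differ by about 3 km because the haversine formula treats the Earth as a perfect sphere, while Vincenty's formula accounts for its ellipsoidal shape.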

How long does it take to fly from Land's End to Pisa?

The estimated flight time from Land's End Airport to Pisa International Airport is 2 hours and 9 minutes.
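The calculator does not publish its timing model, but estimates like this typically follow a rule of thumb of distance divided by cruise speed plus a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute overhead below are illustrative assumptions, not the site's actual parameters, which is why the result differs slightly from the 2 hours 9 minutes quoted.

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Illustrative rule of thumb only; the site's exact parameters are unknown.
    return distance_miles / cruise_mph * 60 + overhead_min

m = estimated_flight_minutes(878)
print(f"{int(m // 60)} h {int(m % 60)} min")   # ≈ 2 h 15 min with these assumptions
```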

Flight carbon footprint between Land's End Airport (LEQ) and Pisa International Airport (PSA)

On average, flying from Land's End to Pisa generates about 142 kg of CO2 per passenger, which is roughly 313 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
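The methodology behind the 142 kg figure is not published. As a rough reconstruction, per-passenger CO2 is often estimated as fuel burned per passenger multiplied by the standard factor of about 3.16 kg of CO2 per kg of jet fuel; the 0.032 kg of fuel per passenger-kilometre below is an assumed short-haul average, not the calculator's value.

```python
CO2_PER_KG_FUEL = 3.16    # kg CO2 per kg of jet fuel (standard emission factor)
FUEL_PER_PAX_KM = 0.032   # kg fuel per passenger-km (assumed short-haul average)

def co2_kg(distance_km):
    return distance_km * FUEL_PER_PAX_KM * CO2_PER_KG_FUEL

print(f"{co2_kg(1413):.0f} kg CO2 per passenger")   # ≈ 143 kg with these assumptions
```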

Map of flight path and driving directions from Land's End to Pisa

See the map of the shortest flight path between Land's End Airport (LEQ) and Pisa International Airport (PSA).

Airport information

Origin: Land's End Airport
City: Land's End
Country: United Kingdom
IATA Code: LEQ
ICAO Code: EGHC
Coordinates: 50°6′10″N, 5°40′14″W
Destination: Pisa International Airport
City: Pisa
Country: Italy
IATA Code: PSA
ICAO Code: LIRP
Coordinates: 43°41′2″N, 10°23′33″E
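To plug these coordinates into the formulas above, the degrees-minutes-seconds values need converting to decimal degrees. A small helper (the function name is my own):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert D°M′S″ plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(50, 6, 10, "N"), dms_to_decimal(5, 40, 14, "W"))   # 50.1028 -5.6706
print(dms_to_decimal(43, 41, 2, "N"), dms_to_decimal(10, 23, 33, "E"))  # 43.6839 10.3925
```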