
How far is Pisa from Limnos?

The distance between Limnos (Lemnos International Airport) and Pisa (Pisa International Airport) is 808 miles / 1300 kilometers / 702 nautical miles.

The driving distance from Limnos (LXS) to Pisa (PSA) is 1305 miles / 2100 kilometers, and travel time by car is about 40 hours 6 minutes.

Lemnos International Airport – Pisa International Airport

808 miles / 1300 kilometers / 702 nautical miles


Distance from Limnos to Pisa

There are several ways to calculate the distance from Limnos to Pisa. Here are two standard methods:

Vincenty's formula (applied above)
  • 808.079 miles
  • 1300.478 kilometers
  • 702.202 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
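As a minimal sketch of how that figure can be reproduced, the standard Vincenty inverse formula on the WGS-84 ellipsoid can be implemented directly; the airport coordinates are taken from the airport information section below, and the function name is illustrative:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                  # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                       # iterate until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# LXS (39°55′1″N, 25°14′10″E) to PSA (43°41′2″N, 10°23′33″E)
km = vincenty_distance(39 + 55/60 + 1/3600, 25 + 14/60 + 10/3600,
                       43 + 41/60 + 2/3600, 10 + 23/60 + 33/3600) / 1000
print(f"{km:.3f} km")
```

Run against the coordinates above, this yields approximately 1300.5 km, matching the figure quoted.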

Haversine formula
  • 806.289 miles
  • 1297.596 kilometers
  • 700.646 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
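The haversine calculation is much shorter; here is a minimal sketch assuming a mean Earth radius of 6371 km and the airport coordinates listed below (the function name is illustrative):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of mean Earth radius 6371 km."""
    R = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# LXS (39°55′1″N, 25°14′10″E) to PSA (43°41′2″N, 10°23′33″E)
km = haversine_km(39 + 55/60 + 1/3600, 25 + 14/60 + 10/3600,
                  43 + 41/60 + 2/3600, 10 + 23/60 + 33/3600)
print(f"{km:.3f} km")
```

This gives roughly 1297.6 km, slightly shorter than the Vincenty result because a sphere underestimates distances at these latitudes.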

How long does it take to fly from Limnos to Pisa?

The estimated flight time from Lemnos International Airport to Pisa International Airport is 2 hours and 1 minute.
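A common rule of thumb for such estimates is to divide the distance by a typical cruise speed and add a fixed allowance for taxi, climb and descent. The cruise speed and overhead below are illustrative assumptions, not the calculator's published parameters, so the result only approximates the quoted figure:

```python
# Rough flight-time estimate (assumed parameters, not the site's method)
distance_miles = 808
cruise_mph = 500          # assumed typical jet cruise speed
overhead_min = 30         # assumed allowance for taxi, climb and descent

total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
print(f"{total_min // 60} h {total_min % 60:02d} min")
```

With these assumptions the estimate comes out at about 2 hours, in line with the quoted time.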

Flight carbon footprint between Lemnos International Airport (LXS) and Pisa International Airport (PSA)

On average, flying from Limnos to Pisa generates about 136 kg of CO2 per passenger; 136 kilograms is equal to 299 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
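The unit conversion behind that figure is a one-liner; the pound is defined as exactly 0.45359237 kg, and the per-kilometre intensity below is simply the quoted total divided by the route distance:

```python
KG_PER_LB = 0.45359237            # exact definition of the avoirdupois pound

co2_kg = 136                      # quoted per-passenger estimate
co2_lbs = co2_kg / KG_PER_LB
print(f"{co2_lbs:.0f} lbs")       # ~300 lbs

per_km = co2_kg / 1300.478        # implied intensity over the Vincenty distance
print(f"{per_km:.3f} kg CO2 per km")
```

The implied intensity of roughly 0.1 kg of CO2 per passenger-kilometre is consistent with typical short-haul estimates.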

Map of flight path and driving directions from Limnos to Pisa

See the map of the shortest flight path between Lemnos International Airport (LXS) and Pisa International Airport (PSA).

Airport information

Origin Lemnos International Airport
City: Limnos
Country: Greece
IATA Code: LXS
ICAO Code: LGLM
Coordinates: 39°55′1″N, 25°14′10″E
Destination Pisa International Airport
City: Pisa
Country: Italy
IATA Code: PSA
ICAO Code: LIRP
Coordinates: 43°41′2″N, 10°23′33″E