
How far is Lyon from Sarasota, FL?

The distance between Sarasota (Sarasota–Bradenton International Airport) and Lyon (Lyon–Saint-Exupéry Airport) is 4791 miles / 7711 kilometers / 4164 nautical miles.


Distance from Sarasota to Lyon

There are several ways to calculate the distance from Sarasota to Lyon. Here are two standard methods:

Vincenty's formula (applied above)
  • 4791.452 miles
  • 7711.095 kilometers
  • 4163.658 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
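As a rough illustration, here is a sketch of Vincenty's inverse method in Python on the WGS-84 ellipsoid, using the SRQ and LYS coordinates from the airport information below (the decimal-degree conversions are my own, and this is a textbook rendering of the iteration, not the exact code used by this site):

```python
from math import atan, atan2, radians, sin, cos, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns km."""
    a = 6378137.0                      # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563              # WGS-84 flattening
    b = (1 - f) * a                    # semi-minor axis
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sU1, cU1, sU2, cU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):               # iterate lambda until it converges
        sL, cL = sin(lam), cos(lam)
        s_sig = sqrt((cU2 * sL) ** 2 + (cU1 * sU2 - sU1 * cU2 * cL) ** 2)
        c_sig = sU1 * sU2 + cU1 * cU2 * cL
        sigma = atan2(s_sig, c_sig)
        s_alpha = cU1 * cU2 * sL / s_sig
        c2_alpha = 1 - s_alpha ** 2
        c_2sm = c_sig - 2 * sU1 * sU2 / c2_alpha
        C = f / 16 * c2_alpha * (4 + f * (4 - 3 * c2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * s_alpha * (
            sigma + C * s_sig * (c_2sm + C * c_sig * (-1 + 2 * c_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = c2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sig = B * s_sig * (c_2sm + B / 4 * (c_sig * (-1 + 2 * c_2sm ** 2)
            - B / 6 * c_2sm * (-3 + 4 * s_sig ** 2) * (-3 + 4 * c_2sm ** 2)))
    return b * A * (sigma - d_sig) / 1000.0

# SRQ (27°23′43″N, 82°33′15″W) to LYS (45°43′35″N, 5°5′26″E)
km = vincenty_km(27.39528, -82.55417, 45.72639, 5.09056)
print(f"{km:.3f} km")
```

The result lands within a few kilometers of the 7711.095 km figure quoted above; small differences come from rounding the coordinates to decimal degrees.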

Haversine formula
  • 4782.038 miles
  • 7695.945 kilometers
  • 4155.478 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
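The haversine calculation is short enough to show in full. This sketch assumes the commonly used mean Earth radius of 6371 km and the SRQ/LYS coordinates from the airport information below:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

# SRQ (27°23′43″N, 82°33′15″W) to LYS (45°43′35″N, 5°5′26″E)
km = haversine_km(27.39528, -82.55417, 45.72639, 5.09056)
print(f"{km:.3f} km, {km / 1.609344:.3f} mi, {km / 1.852:.3f} nmi")
```

The output agrees with the haversine figures above to within a few kilometers; the statute-mile and nautical-mile values are derived from the kilometer result using the exact conversion factors 1.609344 and 1.852.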

How long does it take to fly from Sarasota to Lyon?

The estimated flight time from Sarasota–Bradenton International Airport to Lyon–Saint-Exupéry Airport is 9 hours and 34 minutes.
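An estimate like this is consistent with a simple distance-over-speed calculation. Assuming an average block speed of about 500 mph (an assumption for illustration, not the site's published method):

```python
distance_mi = 4791.452        # Vincenty distance from above
avg_speed_mph = 500.0         # assumed average speed, not an official figure

t = distance_mi / avg_speed_mph          # total time in hours
hours = int(t)
minutes = int((t - hours) * 60)
print(f"{hours} h {minutes} min")        # about 9 h 34 min
```

At that assumed speed the arithmetic reproduces the 9 hours 34 minutes quoted above; a different cruise-speed assumption would shift the result by a few minutes.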

Flight carbon footprint between Sarasota–Bradenton International Airport (SRQ) and Lyon–Saint-Exupéry Airport (LYS)

On average, flying from Sarasota to Lyon generates about 557 kg of CO2 per passenger; 557 kilograms equals 1,228 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
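The pound conversion follows directly from the exact definition of the pound (1 lb = 0.45359237 kg):

```python
co2_kg = 557.0                  # estimated CO2 per passenger for this route
KG_PER_LB = 0.45359237          # exact definition of the avoirdupois pound

co2_lb = co2_kg / KG_PER_LB
print(f"{co2_lb:.0f} lbs")      # rounds to 1228 lbs
```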

Map of flight path from Sarasota to Lyon

See the map of the shortest flight path between Sarasota–Bradenton International Airport (SRQ) and Lyon–Saint-Exupéry Airport (LYS).

Airport information

Origin: Sarasota–Bradenton International Airport
City: Sarasota, FL
Country: United States
IATA Code: SRQ
ICAO Code: KSRQ
Coordinates: 27°23′43″N, 82°33′15″W
Destination: Lyon–Saint-Exupéry Airport
City: Lyon
Country: France
IATA Code: LYS
ICAO Code: LFLL
Coordinates: 45°43′35″N, 5°5′26″E