
How far is Saidu Sharif from Linyi?

The distance between Linyi (Linyi Qiyang Airport) and Saidu Sharif (Saidu Sharif Airport) is 2591 miles / 4170 kilometers / 2251 nautical miles.

The driving distance from Linyi (LYI) to Saidu Sharif (SDT) is 3483 miles / 5605 kilometers, and travel time by car is about 64 hours 18 minutes.

Linyi Qiyang Airport – Saidu Sharif Airport

2591 miles / 4170 kilometers / 2251 nautical miles


Distance from Linyi to Saidu Sharif

There are several ways to calculate the distance from Linyi to Saidu Sharif. Here are two standard methods:

Vincenty's formula (applied above)
  • 2590.868 miles
  • 4169.598 kilometers
  • 2251.403 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
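As a rough cross-check, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the decimal-degree equivalents of the DMS values listed under "Airport information" below; the output should land close to the ~2591 mi / 4170 km figure quoted above, though this is an illustrative implementation rather than the calculator's own code.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty's inverse formula on the WGS-84 ellipsoid.

    Returns the geodesic distance in meters between two
    latitude/longitude points given in decimal degrees.
    """
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)

    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha is zero only for points along the equator
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha != 0 else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)   # meters

# LYI and SDT positions, converted from the DMS coordinates listed below.
lyi = (35.045833, 118.411944)
sdt = (34.813333, 72.352778)
meters = vincenty_distance(*lyi, *sdt)
print(f"{meters / 1609.344:.1f} mi, {meters / 1000:.1f} km, "
      f"{meters / 1852:.1f} NM")
```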

Haversine formula
  • 2585.136 miles
  • 4160.373 kilometers
  • 2246.422 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
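A minimal haversine sketch follows, using an assumed mean Earth radius of 6371 km (a common convention; other radius choices shift the result slightly). With the same airport coordinates it returns roughly 4160 km, in line with the figure above.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine(35.045833, 118.411944, 34.813333, 72.352778)
print(f"{km:.1f} km  ({km / 1.609344:.1f} mi, {km / 1.852:.1f} NM)")
```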

How long does it take to fly from Linyi to Saidu Sharif?

The estimated flight time from Linyi Qiyang Airport to Saidu Sharif Airport is 5 hours and 24 minutes.
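The calculator's exact assumptions are not published, but the arithmetic is easy to reproduce under a hypothetical average block speed. The sketch below simply divides the straight-line distance by an assumed 480 mph, which happens to land near the quoted 5 hours 24 minutes; the real estimate may use a different speed or add fixed taxi/climb time.

```python
# Hypothetical flight-time estimate: distance divided by an assumed
# average block speed (not a figure published on this page).
distance_miles = 2591
assumed_speed_mph = 480          # assumption for illustration only

hours = distance_miles / assumed_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"about {h} h {m} min")    # roughly 5 h 24 min
```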

Flight carbon footprint between Linyi Qiyang Airport (LYI) and Saidu Sharif Airport (SDT)

On average, flying from Linyi to Saidu Sharif generates about 286 kg of CO2 per passenger (286 kilograms is roughly 630 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
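The kilogram-to-pound conversion is plain arithmetic (1 kg ≈ 2.20462 lb); how the 286 kg per-passenger figure itself is derived from fuel burn is not spelled out on the page.

```python
co2_kg = 286
print(f"{co2_kg * 2.20462:.1f} lb")   # ≈ 630.5 lb, rounded to 630 above
```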


Airport information

Origin: Linyi Qiyang Airport
City: Linyi
Country: China
IATA Code: LYI
ICAO Code: ZSLY
Coordinates: 35°2′45″N, 118°24′43″E
Destination: Saidu Sharif Airport
City: Saidu Sharif
Country: Pakistan
IATA Code: SDT
ICAO Code: OPSS
Coordinates: 34°48′48″N, 72°21′10″E
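The coordinates above are given in degrees, minutes, and seconds. A small helper like the one below (the function name is illustrative) converts them to the decimal degrees used in the distance sketches earlier on this page.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Linyi Qiyang Airport (LYI): 35°2′45″N, 118°24′43″E
lyi = (dms_to_decimal(35, 2, 45, "N"), dms_to_decimal(118, 24, 43, "E"))
# Saidu Sharif Airport (SDT): 34°48′48″N, 72°21′10″E
sdt = (dms_to_decimal(34, 48, 48, "N"), dms_to_decimal(72, 21, 10, "E"))
print(lyi, sdt)   # ≈ (35.0458, 118.4119) and (34.8133, 72.3528)
```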