
How far is Sanming from Larnaca?

The distance between Larnaca (Larnaca International Airport) and Sanming (Shaxian Airport) is 4902 miles / 7889 kilometers / 4260 nautical miles.

The driving distance from Larnaca (LCA) to Sanming (SQJ) is 6311 miles / 10156 kilometers, and travel time by car is about 122 hours 48 minutes.

Distance from Larnaca to Sanming

There are several ways to calculate the distance from Larnaca to Sanming. Here are two standard methods:

Vincenty's formula (applied above)
  • 4902.189 miles
  • 7889.308 kilometers
  • 4259.886 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
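In code, the same ellipsoidal inverse problem can be solved with the geographiclib package. Note this is a minimal sketch, not the site's actual implementation, and geographiclib uses Karney's algorithm rather than Vincenty's original iteration (the results agree to well under a millimeter):

```python
# Ellipsoidal (WGS84) distance between the two airports.
from geographiclib.geodesic import Geodesic

# Airport coordinates in decimal degrees (from the airport information below)
LCA = (34.8750, 33.6247)   # Larnaca International Airport
SQJ = (26.4261, 117.8333)  # Shaxian Airport

result = Geodesic.WGS84.Inverse(LCA[0], LCA[1], SQJ[0], SQJ[1])
meters = result["s12"]  # geodesic distance in meters

print(f"{meters / 1609.344:.3f} miles")       # ~4902 miles
print(f"{meters / 1000:.3f} kilometers")      # ~7889 km
print(f"{meters / 1852:.3f} nautical miles")  # ~4260 NM
```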

Haversine formula
  • 4892.910 miles
  • 7874.375 kilometers
  • 4251.822 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
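The haversine calculation is compact enough to sketch directly. This minimal version assumes the commonly used mean earth radius of 3,958.8 miles (6,371 km):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two points on a spherical earth."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * radius_miles * asin(sqrt(a))

print(haversine_miles(34.8750, 33.6247, 26.4261, 117.8333))  # ~4893 miles
```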

How long does it take to fly from Larnaca to Sanming?

The estimated flight time from Larnaca International Airport to Shaxian Airport is 9 hours and 46 minutes.
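The page does not state the model behind this estimate, but a back-of-envelope check using only the figures above shows the implied average ground speed:

```python
# Hypothetical sanity check: back out the implied average speed, assuming
# the estimate is simply distance divided by speed (the actual model may
# also include fixed taxi/climb allowances).
distance_miles = 4902.189
flight_hours = 9 + 46 / 60             # 9 hours 46 minutes
print(distance_miles / flight_hours)   # ~502 mph implied average speed
```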

Flight carbon footprint between Larnaca International Airport (LCA) and Shaxian Airport (SQJ)

On average, flying from Larnaca to Sanming generates about 571 kg of CO2 per passenger, which is equivalent to 1,259 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
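The pound figure is a straight unit conversion (1 kg ≈ 2.20462 lb):

```python
kg = 571
print(kg * 2.20462)  # ~1258.8, which rounds to 1,259 lbs
```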

Map of flight path and driving directions from Larnaca to Sanming

See the map of the shortest flight path between Larnaca International Airport (LCA) and Shaxian Airport (SQJ).

Airport information

Origin: Larnaca International Airport
City: Larnaca
Country: Cyprus
IATA Code: LCA
ICAO Code: LCLK
Coordinates: 34°52′30″N, 33°37′29″E

Destination: Shaxian Airport
City: Sanming
Country: China
IATA Code: SQJ
ICAO Code: ZSSM
Coordinates: 26°25′34″N, 117°50′0″E
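The coordinates above are given in degrees, minutes, and seconds, while the distance formulas take decimal degrees. A small conversion helper (hypothetical, for illustration) bridges the two:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(34, 52, 30, "N"))  # 34.8750   (LCA latitude)
print(dms_to_decimal(33, 37, 29, "E"))  # 33.6247   (LCA longitude)
print(dms_to_decimal(26, 25, 34, "N"))  # 26.4261   (SQJ latitude)
print(dms_to_decimal(117, 50, 0, "E"))  # 117.8333  (SQJ longitude)
```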