
How far is Shihezi from Larnaca?

The distance between Larnaca (Larnaca International Airport) and Shihezi (Shihezi Huayuan Airport) is 2815 miles / 4531 kilometers / 2447 nautical miles.

The driving distance from Larnaca (LCA) to Shihezi (SHF) is 3773 miles / 6072 kilometers, and travel time by car is about 77 hours 12 minutes.

Larnaca International Airport – Shihezi Huayuan Airport

  • 2815 miles
  • 4531 kilometers
  • 2447 nautical miles


Distance from Larnaca to Shihezi

There are several ways to calculate the distance from Larnaca to Shihezi. Here are two standard methods:

Vincenty's formula (applied above)
  • 2815.465 miles
  • 4531.051 kilometers
  • 2446.572 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
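The iteration behind Vincenty's inverse method can be sketched as follows. This is a minimal Python version using the standard WGS-84 ellipsoid constants; the exact constants and convergence handling used for the figures above are not stated on this page, and this simplified sketch may fail to converge for nearly antipodal points.

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty formula on the WGS-84 ellipsoid (distance in km)."""
    a = 6378137.0          # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563  # WGS-84 flattening
    b = a * (1 - f)        # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); cos2_alpha == 0 only for equatorial geodesics
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha != 0 else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# LCA -> SHF, coordinates from the airport information below
print(vincenty_distance_km(34.875, 33.62472, 44.24194, 85.89028))
```

With the LCA and SHF coordinates this yields roughly 4531 km, matching the Vincenty figure above.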

Haversine formula
  • 2809.112 miles
  • 4520.827 kilometers
  • 2441.051 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
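A minimal Python version of the haversine calculation, assuming a mean Earth radius of 6371 km (the exact radius used for the figures above is not stated on this page):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius; assumed, not stated by the page

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# LCA -> SHF, coordinates from the airport information below
print(round(haversine_km(34.875, 33.62472, 44.24194, 85.89028), 1))  # ≈ 4520.8
```

Note the result is a few kilometers shorter than the Vincenty figure, since the spherical model ignores the Earth's flattening.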

How long does it take to fly from Larnaca to Shihezi?

The estimated flight time from Larnaca International Airport to Shihezi Huayuan Airport is 5 hours and 49 minutes.
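The cruise-speed assumption behind this estimate is not stated, but the implied average block speed can be back-calculated from the figures above:

```python
distance_miles = 2815.465        # Vincenty distance from above
flight_time_hours = 5 + 49 / 60  # 5 hours 49 minutes

avg_speed_mph = distance_miles / flight_time_hours
print(round(avg_speed_mph))  # -> 484 mph average over the whole flight
```

That average is below typical jet cruise speed because it spreads taxi, climb, and descent time over the full distance.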

Flight carbon footprint between Larnaca International Airport (LCA) and Shihezi Huayuan Airport (SHF)

On average, flying from Larnaca to Shihezi generates about 312 kg of CO2 per passenger, equivalent to 688 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
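The kilogram-to-pound conversion above checks out, using the exact definition of the pound:

```python
co2_kg_per_passenger = 312  # per-passenger estimate from the page
KG_PER_LB = 0.45359237      # exact definition of the avoirdupois pound

co2_lbs = co2_kg_per_passenger / KG_PER_LB
print(round(co2_lbs))  # -> 688
```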

Map of flight path and driving directions from Larnaca to Shihezi

See the map of the shortest flight path between Larnaca International Airport (LCA) and Shihezi Huayuan Airport (SHF).

Airport information

Origin: Larnaca International Airport
City: Larnaca
Country: Cyprus
IATA Code: LCA
ICAO Code: LCLK
Coordinates: 34°52′30″N, 33°37′29″E
Destination: Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E