
How far is Shijiazhuang from Luqa?

The distance between Luqa (Malta International Airport) and Shijiazhuang (Shijiazhuang Zhengding International Airport) is 5230 miles / 8417 kilometers / 4545 nautical miles.

Malta International Airport – Shijiazhuang Zhengding International Airport

5230 miles / 8417 kilometers / 4545 nautical miles


Distance from Luqa to Shijiazhuang

There are several ways to calculate the distance from Luqa to Shijiazhuang. Here are two standard methods:

Vincenty's formula (applied above)
  • 5229.978 miles
  • 8416.833 kilometers
  • 4544.726 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
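For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are those listed in the Airport information section, converted to decimal degrees; the constants, convergence tolerance, and rounding are assumptions, and the site's own implementation may differ in detail.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters -> statute miles

# MLA and SJW coordinates in decimal degrees (from the Airport information section)
print(vincenty_miles(35.8572, 14.4775, 38.2806, 114.6969))  # ≈ 5230 miles (page quotes 5229.978)
```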

Haversine formula
  • 5218.036 miles
  • 8397.615 kilometers
  • 4534.349 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
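Below is a minimal Python sketch of the haversine calculation, assuming the commonly used 6371 km mean Earth radius; the radius the site actually uses is not stated here, so the last decimal places may differ.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius; returns miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    km = 2 * radius_km * math.asin(math.sqrt(h))
    return km / 1.609344       # kilometers -> statute miles

# MLA (35°51′26″N, 14°28′39″E) to SJW (38°16′50″N, 114°41′49″E) in decimal degrees
print(haversine_miles(35.8572, 14.4775, 38.2806, 114.6969))  # ≈ 5218 miles with this radius
```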

How long does it take to fly from Luqa to Shijiazhuang?

The estimated flight time from Malta International Airport to Shijiazhuang Zhengding International Airport is 10 hours and 24 minutes.
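The calculator's exact flight-time model is not shown on this page. A common approach is simply distance divided by an assumed average block speed, as in the hypothetical sketch below; the 500 mph figure is only an illustration, and the quoted 10 hours and 24 minutes corresponds to an average of roughly 503 mph over the 5230-mile route.

```python
def estimated_flight_time(distance_miles, avg_speed_mph=500.0):
    """Rough block-time estimate: distance divided by an assumed average speed."""
    total_min = round(distance_miles / avg_speed_mph * 60)
    hours, minutes = divmod(total_min, 60)
    return f"{hours} hours and {minutes} minutes"

# With the assumed 500 mph this gives about 10 hours and 28 minutes;
# the 10 h 24 min quoted above implies an average speed of about 503 mph.
print(estimated_flight_time(5230))
```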

Flight carbon footprint between Malta International Airport (MLA) and Shijiazhuang Zhengding International Airport (SJW)

On average, flying from Luqa to Shijiazhuang generates about 614 kg of CO2 per passenger, which is equal to about 1,353 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
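As a quick check of the unit conversion, here is a short Python sketch; the per-mile figure at the end is simply derived from the numbers quoted above and is not the site's emission methodology.

```python
KG_PER_LB = 0.45359237          # exact kilograms per pound

co2_kg = 614
co2_lbs = co2_kg / KG_PER_LB
print(round(co2_lbs, 1))        # ≈ 1353.6 lbs, shown above rounded to 1,353

# Illustrative per-passenger emission factor implied by the figures on this page
print(round(co2_kg / 5230, 3))  # ≈ 0.117 kg of CO2 per mile flown
```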

Map of flight path from Luqa to Shijiazhuang

See the map of the shortest flight path between Malta International Airport (MLA) and Shijiazhuang Zhengding International Airport (SJW).

Airport information

Origin: Malta International Airport
City: Luqa
Country: Malta
IATA Code: MLA
ICAO Code: LMML
Coordinates: 35°51′26″N, 14°28′39″E
Destination: Shijiazhuang Zhengding International Airport
City: Shijiazhuang
Country: China
IATA Code: SJW
ICAO Code: ZBSJ
Coordinates: 38°16′50″N, 114°41′49″E
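The coordinates above are given in degrees, minutes, and seconds. Here is a small sketch for converting them to the decimal degrees used in the distance examples earlier; the helper name is illustrative.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Malta International Airport (MLA): 35°51′26″N, 14°28′39″E
print(dms_to_decimal(35, 51, 26, "N"), dms_to_decimal(14, 28, 39, "E"))   # ≈ 35.8572, 14.4775
# Shijiazhuang Zhengding International Airport (SJW): 38°16′50″N, 114°41′49″E
print(dms_to_decimal(38, 16, 50, "N"), dms_to_decimal(114, 41, 49, "E"))  # ≈ 38.2806, 114.6969
```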