How far is Qinhuangdao from Luqa?

The distance between Luqa (Malta International Airport) and Qinhuangdao (Qinhuangdao Beidaihe Airport) is 5365 miles / 8635 kilometers / 4662 nautical miles.

Malta International Airport – Qinhuangdao Beidaihe Airport


Distance from Luqa to Qinhuangdao

There are several ways to calculate the distance from Luqa to Qinhuangdao. Here are two standard methods:

Vincenty's formula (applied above)
  • 5365.498 miles
  • 8634.931 kilometers
  • 4662.490 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
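As a rough illustration, here is a sketch of the standard iterative Vincenty inverse solution on the WGS-84 ellipsoid, using the coordinates from the airport table below. This is a generic textbook implementation, not necessarily the exact code used for the figures above.

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres (Vincenty inverse, WGS-84)."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0          # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)   # equatorial-line special case
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# MLA → BPE, coordinates converted to decimal degrees
distance_m = vincenty_m(35.857222, 14.477500, 39.666389, 119.058889)
```

For this route the result lands close to the 8,634.931 km quoted above; small differences can arise from rounding of the input coordinates.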

Haversine formula
  • 5353.110 miles
  • 8614.996 kilometers
  • 4651.726 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
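The haversine calculation is compact enough to sketch directly. This version assumes a mean Earth radius of 6,371 km (a common convention; other radius choices shift the result slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# MLA → BPE, coordinates converted to decimal degrees
distance_km = haversine_km(35.857222, 14.477500, 39.666389, 119.058889)
```

For this route the spherical result comes out near the 8,614.996 km quoted above, about 20 km shorter than the ellipsoidal Vincenty figure.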

How long does it take to fly from Luqa to Qinhuangdao?

The estimated flight time from Malta International Airport to Qinhuangdao Beidaihe Airport is 10 hours and 39 minutes.
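Flight-time estimates of this kind are typically derived from the distance alone: a fixed overhead for taxi, climb, and descent plus cruise at an assumed average speed. The site's exact assumptions aren't stated, so the overhead and speed below are illustrative defaults, and the output will differ slightly from the 10 h 39 min quoted above:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough block time: fixed taxi/climb/descent overhead plus cruise
    at an assumed average speed. Both parameters are illustrative guesses."""
    hours = overhead_hours + distance_miles / cruise_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

h, m = estimate_flight_time(5365.498)
```

A slightly higher average speed assumption (around 530 mph) reproduces the quoted figure almost exactly.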

Flight carbon footprint between Malta International Airport (MLA) and Qinhuangdao Beidaihe Airport (BPE)

On average, flying from Luqa to Qinhuangdao generates about 632 kg of CO2 per passenger; 632 kilograms equals 1,393 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
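Per-passenger CO2 estimates like this are usually distance times an emission factor. The factor below (about 73 g CO2 per passenger-kilometre) is back-derived from the figures above (632 kg over ~8,635 km), not an official value:

```python
def co2_per_passenger_kg(distance_km, kg_per_pkm=0.0732):
    """Rough per-passenger CO2 estimate. The default emission factor is
    back-derived from this route's quoted figures, not an official value."""
    return distance_km * kg_per_pkm

def kg_to_lbs(kg):
    """Convert kilograms to pounds."""
    return kg * 2.20462

estimate_kg = co2_per_passenger_kg(8634.9)
```

Real per-passenger factors vary with aircraft type, seating density, and load factor, so treat this as an order-of-magnitude sketch.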

Map of flight path from Luqa to Qinhuangdao

See the map of the shortest flight path between Malta International Airport (MLA) and Qinhuangdao Beidaihe Airport (BPE).

Airport information

Origin Malta International Airport
City: Luqa
Country: Malta
IATA Code: MLA
ICAO Code: LMML
Coordinates: 35°51′26″N, 14°28′39″E
Destination Qinhuangdao Beidaihe Airport
City: Qinhuangdao
Country: China
IATA Code: BPE
ICAO Code: ZBDH
Coordinates: 39°39′59″N, 119°3′32″E
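The coordinates above are in degrees-minutes-seconds form; distance formulas need decimal degrees. A small converter (the helper name is ours, for illustration):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Airport coordinates from the table above
mla = (dms_to_decimal(35, 51, 26, "N"), dms_to_decimal(14, 28, 39, "E"))
bpe = (dms_to_decimal(39, 39, 59, "N"), dms_to_decimal(119, 3, 32, "E"))
```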