
How far is Beijing from Lancaster, PA?

The distance between Lancaster (Lancaster Airport (Pennsylvania)) and Beijing (Beijing Daxing International Airport) is 6892 miles / 11092 kilometers / 5989 nautical miles.

Lancaster Airport (Pennsylvania) – Beijing Daxing International Airport

  • 6892 miles
  • 11092 kilometers
  • 5989 nautical miles


Distance from Lancaster to Beijing

There are several ways to calculate the distance from Lancaster to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 6892.439 miles
  • 11092.305 kilometers
  • 5989.366 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
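The page does not publish its implementation, but Vincenty's inverse method has a well-known standard form. A sketch in Python, assuming the WGS-84 ellipsoid (semi-major axis `a` and flattening `f` below are WGS-84 values; the page does not state which ellipsoid it uses):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Geodesic distance on the WGS-84 ellipsoid via Vincenty's inverse formula."""
    a = 6378137.0               # WGS-84 semi-major axis, metres
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until lambda converges
        sinL, cosL = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinL,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosL)
        if sinSigma == 0:
            return 0.0          # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosL
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinL / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha != 0 else 0.0)   # equatorial-line guard
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM
            + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                             * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma) / 1000.0    # kilometres

# LNS and PKX coordinates from the airport information below
print(round(vincenty_km(40.121667, -76.295833, 39.509167, 116.410556), 3))
```

With WGS-84 parameters this should reproduce the quoted 11092.305 km to within a few metres.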

Haversine formula
  • 6876.773 miles
  • 11067.094 kilometers
  • 5975.753 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
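The haversine number above can be reproduced in a few lines. A minimal sketch, assuming a mean Earth radius of 6371 km (the conventional value, which matches the page's figure) and the airport coordinates listed below:

```python
import math

# Coordinates (decimal degrees) from the airport information below
LNS = (40.121667, -76.295833)    # Lancaster Airport (Pennsylvania)
PKX = (39.509167, 116.410556)    # Beijing Daxing International Airport

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.atan2(math.sqrt(a), math.sqrt(1 - a))

print(round(haversine_km(*LNS, *PKX), 3))   # ≈ 11067 km, as quoted above
```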

How long does it take to fly from Lancaster to Beijing?

The estimated flight time from Lancaster Airport (Pennsylvania) to Beijing Daxing International Airport is 13 hours and 32 minutes.
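The page does not say how it estimates flight time, but 13 h 32 min over 6892.439 miles implies an average block speed of roughly 509 mph. A back-of-envelope check, assuming an average block speed of 510 mph (an assumption, not the site's stated method):

```python
distance_miles = 6892.439        # Vincenty distance from above
AVG_BLOCK_SPEED_MPH = 510        # assumed; chosen to be near the implied ~509 mph

hours = distance_miles / AVG_BLOCK_SPEED_MPH
h = int(hours)
m = round((hours - h) * 60)
print(f"about {h} h {m} min")    # → about 13 h 31 min
```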

Flight carbon footprint between Lancaster Airport (Pennsylvania) (LNS) and Beijing Daxing International Airport (PKX)

On average, flying from Lancaster to Beijing generates about 840 kg of CO2 per passenger; 840 kilograms equals 1,851 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
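The pound figure can be checked directly, since the pound is defined as exactly 0.45359237 kg; 840 kg works out to about 1,851.9 lbs, so the quoted value is the truncated result. Over 6892 miles, 840 kg also implies roughly 0.122 kg of CO2 per passenger-mile:

```python
co2_kg = 840
KG_PER_POUND = 0.45359237            # exact definition of the pound
co2_lbs = co2_kg / KG_PER_POUND
print(round(co2_lbs, 1))             # → 1851.9 lbs

intensity = co2_kg / 6892.439        # kg CO2 per passenger-mile
print(round(intensity, 3))           # → 0.122
```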

Map of flight path from Lancaster to Beijing

See the map of the shortest flight path between Lancaster Airport (Pennsylvania) (LNS) and Beijing Daxing International Airport (PKX).

Airport information

Origin Lancaster Airport (Pennsylvania)
City: Lancaster, PA
Country: United States
IATA Code: LNS
ICAO Code: KLNS
Coordinates: 40°7′18″N, 76°17′45″W
Destination Beijing Daxing International Airport
City: Beijing
Country: China
IATA Code: PKX
ICAO Code: ZBAD
Coordinates: 39°30′33″N, 116°24′38″E