How far is North Platte, NE, from Beijing?

The distance between Beijing (Beijing Daxing International Airport) and North Platte (North Platte Regional Airport) is 6412 miles / 10320 kilometers / 5572 nautical miles.

Beijing Daxing International Airport – North Platte Regional Airport

Distance from Beijing to North Platte

There are several ways to calculate the distance from Beijing to North Platte. Here are two standard methods:

Vincenty's formula (applied above)
  • 6412.259 miles
  • 10319.530 kilometers
  • 5572.100 nautical miles

Vincenty's formula iteratively calculates the distance between latitude/longitude points using an ellipsoidal model of the earth (typically WGS-84). Because it accounts for the earth's flattening, its result differs slightly from the spherical haversine figure below.
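
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid (a standard choice; the calculator does not publish its exact constants), using the airport coordinates from the table at the bottom of this page:

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        """Geodesic distance in statute miles via Vincenty's inverse formula (WGS-84)."""
        a = 6378137.0              # semi-major axis (m)
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):       # iterate until lambda converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        meters = b * A * (sigma - d_sigma)
        return meters / 1609.344   # meters -> statute miles

    # PKX (39°30′33″N, 116°24′38″E) to LBF (41°7′34″N, 100°41′2″W)
    print(vincenty_miles(39.5092, 116.4106, 41.1261, -100.6839))  # ≈ 6412 miles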

Haversine formula
  • 6397.065 miles
  • 10295.078 kilometers
  • 5558.897 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
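
The haversine version is much shorter. A sketch assuming a mean earth radius of 6371 km, an assumed constant that closely reproduces the figures above:

    import math

    def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
        """Great-circle distance on a sphere of mean radius r_km."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
        return 2 * r_km * math.asin(math.sqrt(h))

    km = haversine_km(39.5092, 116.4106, 41.1261, -100.6839)
    print(km, km / 1.609344)   # ≈ 10295 kilometers, ≈ 6397 miles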

How long does it take to fly from Beijing to North Platte?

The estimated flight time from Beijing Daxing International Airport to North Platte Regional Airport is 12 hours and 38 minutes.
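
The site does not state its formula, but the quoted time matches a common heuristic: about 30 minutes for taxi, climb, and descent, plus cruise at roughly 850 km/h. A sketch under that assumption:

    distance_km = 10319.53   # Vincenty distance from above
    cruise_kmh = 850         # assumed average cruise speed
    overhead_h = 0.5         # assumed taxi/climb/descent allowance

    hours = overhead_h + distance_km / cruise_kmh
    h, m = int(hours), round((hours - int(hours)) * 60)
    print(f"{h} hours and {m} minutes")   # -> 12 hours and 38 minutes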

Flight carbon footprint between Beijing Daxing International Airport (PKX) and North Platte Regional Airport (LBF)

On average, flying from Beijing to North Platte generates about 773 kg of CO2 per passenger, which equals about 1,704 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
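
A quick check of the unit conversion and the implied emission intensity (the 773 kg figure is the site's estimate; the conversion factor is standard):

    co2_kg = 773                          # site's per-passenger estimate
    lbs = co2_kg * 2.20462                # kilograms -> pounds
    g_per_km = co2_kg * 1000 / 10319.53   # grams of CO2 per passenger-km

    print(round(lbs))        # -> 1704 lbs
    print(round(g_per_km))   # -> 75 g CO2 per passenger-km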

Map of flight path from Beijing to North Platte

See the map of the shortest flight path between Beijing Daxing International Airport (PKX) and North Platte Regional Airport (LBF).

Airport information

Origin: Beijing Daxing International Airport
City: Beijing
Country: China
IATA Code: PKX
ICAO Code: ZBAD
Coordinates: 39°30′33″N, 116°24′38″E

Destination: North Platte Regional Airport
City: North Platte, NE
Country: United States
IATA Code: LBF
ICAO Code: KLBF
Coordinates: 41°7′34″N, 100°41′2″W
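
To use these coordinates with the formulas above, convert them from degrees/minutes/seconds to decimal degrees. A small helper (hypothetical, for illustration):

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    # PKX: 39°30′33″N, 116°24′38″E -> ≈ (39.5092, 116.4106)
    print(dms_to_decimal(39, 30, 33, "N"), dms_to_decimal(116, 24, 38, "E"))
    # LBF: 41°7′34″N, 100°41′2″W -> ≈ (41.1261, -100.6839)
    print(dms_to_decimal(41, 7, 34, "N"), dms_to_decimal(100, 41, 2, "W"))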