How far is Burlington, IA, from Beijing?

The distance between Beijing (Beijing Capital International Airport) and Burlington (Southeast Iowa Regional Airport) is 6600 miles / 10622 kilometers / 5735 nautical miles.

Beijing Capital International Airport – Southeast Iowa Regional Airport

Distance: 6600 miles / 10622 kilometers / 5735 nautical miles

Distance from Beijing to Burlington

There are several ways to calculate the distance from Beijing to Burlington. Here are two standard methods:

Vincenty's formula (applied above)
  • 6600.010 miles
  • 10621.686 kilometers
  • 5735.252 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
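
As a rough way to reproduce this figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The page's exact ellipsoid parameters and rounding are not stated, so the result may differ slightly from the values above, and the iteration can fail to converge for nearly antipodal points.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in metres between two points on the WGS-84 ellipsoid
    using Vincenty's inverse formula (iterative)."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero on an equatorial geodesic where cos_sq_alpha == 0
        cos2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2)
        - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# PEK -> BRL, using the decimal form of the coordinates listed further down
d_m = vincenty_distance(40.08, 116.5847, 40.7831, -91.1253)
print(d_m / 1000, "km")   # should land close to the 10,621.7 km quoted above
```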

Haversine formula
  • 6584.493 miles
  • 10596.715 kilometers
  • 5721.768 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
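
Below is a minimal sketch of the haversine formula, assuming the commonly used mean Earth radius of 6,371 km (the page does not state which radius it uses, so the result may differ slightly from the figures above).

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

# PEK -> BRL with the same decimal coordinates as above
print(haversine_distance(40.08, 116.5847, 40.7831, -91.1253))  # roughly 10,600 km
```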

How long does it take to fly from Beijing to Burlington?

The estimated flight time from Beijing Capital International Airport to Southeast Iowa Regional Airport is 12 hours and 59 minutes.
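
Estimates like this are typically derived from the great-circle distance with a rule-of-thumb average speed. The sketch below assumes a 500 mph average block speed and a 30-minute allowance for takeoff and landing; both are assumptions, not the site's published method, so it will not reproduce 12 hours and 59 minutes exactly.

```python
def estimate_flight_time(distance_miles, cruise_speed_mph=500, overhead_hours=0.5):
    """Rule-of-thumb block time: cruise time plus a fixed allowance for
    taxi, climb and descent. Both parameters are assumptions."""
    total_hours = distance_miles / cruise_speed_mph + overhead_hours
    hours, minutes = divmod(round(total_hours * 60), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimate_flight_time(6600))  # about 13 hours 42 minutes with these assumptions
```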

Flight carbon footprint between Beijing Capital International Airport (PEK) and Southeast Iowa Regional Airport (BRL)

On average, flying from Beijing to Burlington generates about 799 kg of CO2 per passenger, equivalent to about 1,761 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
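
The pound figure follows from the kilogram estimate by unit conversion; the 799 kg value itself comes from the page, not from this calculation. A quick sketch of the conversion and the implied per-distance intensity:

```python
KG_PER_LB = 0.45359237           # definition of the pound in kilograms

co2_kg = 799                     # per-passenger estimate quoted above
distance_miles = 6600.0
distance_km = 10622.0

print(f"{co2_kg / KG_PER_LB:.0f} lb")                 # about 1,761 lb
print(f"{co2_kg / distance_miles:.3f} kg per mile")   # about 0.121 kg per passenger-mile
print(f"{co2_kg * 1000 / distance_km:.0f} g per km")  # about 75 g per passenger-km
```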

Map of flight path from Beijing to Burlington

See the map of the shortest flight path between Beijing Capital International Airport (PEK) and Southeast Iowa Regional Airport (BRL).

Airport information

Origin: Beijing Capital International Airport
City: Beijing
Country: China
IATA Code: PEK
ICAO Code: ZBAA
Coordinates: 40°4′48″N, 116°35′5″E
Destination: Southeast Iowa Regional Airport
City: Burlington, IA
Country: United States
IATA Code: BRL
ICAO Code: KBRL
Coordinates: 40°46′59″N, 91°7′31″W
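
The coordinates above are given in degrees, minutes and seconds, while the earlier code sketches use decimal degrees. A small hypothetical helper (assuming the exact ′ and ″ characters used above) converts between the two:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate such as 40°4′48″N or 91°7′31″W to decimal degrees."""
    deg, minutes, seconds, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("40°4′48″N"))    # 40.08       (PEK latitude)
print(dms_to_decimal("116°35′5″E"))   # ≈ 116.5847  (PEK longitude)
print(dms_to_decimal("40°46′59″N"))   # ≈ 40.7831   (BRL latitude)
print(dms_to_decimal("91°7′31″W"))    # ≈ -91.1253  (BRL longitude)
```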