
How far is Burlington, IA, from Shenzhen?

The distance between Shenzhen (Shenzhen Bao'an International Airport) and Burlington (Southeast Iowa Regional Airport) is 7784 miles / 12527 kilometers / 6764 nautical miles.

Shenzhen Bao'an International Airport – Southeast Iowa Regional Airport



Distance from Shenzhen to Burlington

There are several ways to calculate the distance from Shenzhen to Burlington. Here are two standard methods:

Vincenty's formula (applied above)
  • 7783.747 miles
  • 12526.727 kilometers
  • 6763.892 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
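As a rough illustration, the following is a minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (semi-major axis 6378137 m, flattening 1/298.257223563). The airport coordinates are converted from the DMS values listed in the airport information below; the page does not publish its exact implementation, so small differences in the last digits are possible.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Distance in kilometers via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0                   # semi-major axis (m)
    f = 1 / 298.257223563           # flattening
    b = (1 - f) * a                 # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):            # iterate until the longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0              # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# SZX (22°38′21″N, 113°48′39″E) to BRL (40°46′59″N, 91°7′31″W)
print(round(vincenty_km(22.639167, 113.810833, 40.783056, -91.125278)))
```

The result rounds to the 12527 km quoted above.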

Haversine formula
  • 7771.190 miles
  • 12506.518 kilometers
  • 6752.980 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
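The haversine calculation is much simpler. Below is a minimal sketch assuming a mean earth radius of 6371 km (a common choice; the page does not state which radius it uses):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# SZX to BRL, using the airport coordinates listed below
print(round(haversine_km(22.639167, 113.810833, 40.783056, -91.125278)))
```

The spherical result comes out about 20 km shorter than the ellipsoidal Vincenty figure, which is why the two sets of numbers above differ slightly.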

How long does it take to fly from Shenzhen to Burlington?

The estimated flight time from Shenzhen Bao'an International Airport to Southeast Iowa Regional Airport is 15 hours and 14 minutes.
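The page does not publish its flight-time formula. A common rough approach is to divide the great-circle distance by an assumed average block speed; an assumed speed of about 511 mph happens to reproduce the quoted figure here, but that speed is a guess, not the calculator's documented method.

```python
def flight_time(distance_miles, avg_speed_mph=511):
    """Rough block-time estimate. The average speed is an assumption,
    not the calculator's published formula."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours and {m} minutes"

print(flight_time(7784))  # 15 hours and 14 minutes
```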

Flight carbon footprint between Shenzhen Bao'an International Airport (SZX) and Southeast Iowa Regional Airport (BRL)

On average, flying from Shenzhen to Burlington generates about 967 kg of CO2 per passenger (about 2,132 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
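The pound figure is a straightforward unit conversion from the kilogram estimate (1 kg ≈ 2.20462 lb):

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 967
co2_lb = round(co2_kg * KG_TO_LB)
print(co2_lb)  # 2132
```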

Map of flight path from Shenzhen to Burlington

See the map of the shortest flight path between Shenzhen Bao'an International Airport (SZX) and Southeast Iowa Regional Airport (BRL).

Airport information

Origin: Shenzhen Bao'an International Airport
City: Shenzhen
Country: China
IATA Code: SZX
ICAO Code: ZGSZ
Coordinates: 22°38′21″N, 113°48′39″E
Destination: Southeast Iowa Regional Airport
City: Burlington, IA
Country: United States
IATA Code: BRL
ICAO Code: KBRL
Coordinates: 40°46′59″N, 91°7′31″W
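The coordinates above are given in degrees, minutes, and seconds; the distance formulas want signed decimal degrees. A small parser covering this page's DMS format (south and west hemispheres become negative):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 22°38′21″N to signed decimal degrees."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    d, mi, s, hemi = (int(m.group(1)), int(m.group(2)),
                      int(m.group(3)), m.group(4))
    value = d + mi / 60 + s / 3600
    return -value if hemi in "SW" else value

print(round(dms_to_decimal("22°38′21″N"), 6))   # SZX latitude
print(round(dms_to_decimal("91°7′31″W"), 6))    # BRL longitude
```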