How far is Burlington, IA, from Ningbo?

The distance between Ningbo (Ningbo Lishe International Airport) and Burlington (Southeast Iowa Regional Airport) is 7145 miles / 11499 kilometers / 6209 nautical miles.

Ningbo Lishe International Airport – Southeast Iowa Regional Airport


Distance from Ningbo to Burlington

There are several ways to calculate the distance from Ningbo to Burlington. Here are two standard methods:

Vincenty's formula (applied above)
  • 7145.435 miles
  • 11499.463 kilometers
  • 6209.213 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
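For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and convergence tolerance are our own choices, not this site's code; production software would normally use a maintained geodesy library instead.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in metres between two points on the WGS-84 ellipsoid,
    via Vincenty's inverse formula (iterative)."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # usually converges in a handful of iterations
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        # cos2_alpha == 0 only for equatorial lines; avoid division by zero
        cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    # (near-antipodal pairs may fail to converge; real code should detect that)

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1.0 + u2 / 16384.0 * (4096.0 + u2 * (-768.0 + u2 * (320.0 - 175.0 * u2)))
    B = u2 / 1024.0 * (256.0 + u2 * (-128.0 + u2 * (74.0 - 47.0 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)
        - B / 6.0 * cos_2sm * (-3.0 + 4.0 * sin_sigma ** 2) * (-3.0 + 4.0 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# NGB (29°49′36″N, 121°27′43″E) to BRL (40°46′59″N, 91°7′31″W), decimal degrees:
print(vincenty_distance(29.826667, 121.461944, 40.783056, -91.125278) / 1000)
# ≈ 11499 km, matching the figure above
```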

Haversine formula
  • 7131.392 miles
  • 11476.862 kilometers
  • 6197.010 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
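By comparison, the haversine formula fits in a few lines. This is a sketch with our own names and the commonly used 6,371 km mean Earth radius; the site does not publish its exact constants.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km, assuming a spherical Earth
    (the 6371 km mean radius is our assumed constant)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_distance(29.826667, 121.461944, 40.783056, -91.125278))
# ≈ 11477 km, matching the haversine figure above
```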

How long does it take to fly from Ningbo to Burlington?

The estimated flight time from Ningbo Lishe International Airport to Southeast Iowa Regional Airport is 14 hours and 1 minute.
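The site does not state the formula behind this estimate. A common rule of thumb for such calculators is great-circle distance divided by a typical jet cruise speed, plus a fixed allowance for takeoff and landing; the sketch below uses assumed constants of 500 mph and 30 minutes.

```python
def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rule-of-thumb airliner flight time: cruise leg plus a fixed
    takeoff/landing allowance. Both constants are assumptions."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return hours, minutes

print(estimated_flight_time(7145))  # (14, 47) with these assumed constants
# (cruise_mph=510, overhead_min=0 happens to reproduce the 14 h 1 min above,
# but that is reverse-engineering, not the site's documented method)
```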

Flight carbon footprint between Ningbo Lishe International Airport (NGB) and Southeast Iowa Regional Airport (BRL)

On average, flying from Ningbo to Burlington generates about 875 kg (1,930 lb) of CO2 per passenger. The figure is an estimate and includes only the CO2 generated by burning jet fuel.
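Dividing the two figures gives an implied factor of roughly 0.076 kg of CO2 per passenger-kilometre. The helper below back-calculates from this page's own numbers; it is not the site's published methodology.

```python
KG_PER_PAX_KM = 875 / 11499   # ≈ 0.0761, back-calculated from this page's figures
KG_TO_LB = 2.20462

def co2_per_passenger_kg(distance_km, factor=KG_PER_PAX_KM):
    """Per-passenger CO2 estimate from jet-fuel burn only."""
    return distance_km * factor

kg = co2_per_passenger_kg(11499)
print(f"{kg:.0f} kg ≈ {kg * KG_TO_LB:.0f} lb")  # 875 kg ≈ 1929 lb (page rounds to 1,930)
```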

Map of flight path from Ningbo to Burlington

See the map of the shortest flight path between Ningbo Lishe International Airport (NGB) and Southeast Iowa Regional Airport (BRL).

Airport information

Origin: Ningbo Lishe International Airport
City: Ningbo
Country: China
IATA Code: NGB
ICAO Code: ZSNB
Coordinates: 29°49′36″N, 121°27′43″E
Destination: Southeast Iowa Regional Airport
City: Burlington, IA
Country: United States
IATA Code: BRL
ICAO Code: KBRL
Coordinates: 40°46′59″N, 91°7′31″W
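The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion helper (names are our own):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60 + seconds / 3600)

ngb = (dms_to_decimal(29, 49, 36, "N"), dms_to_decimal(121, 27, 43, "E"))
brl = (dms_to_decimal(40, 46, 59, "N"), dms_to_decimal(91, 7, 31, "W"))
print(ngb)  # ≈ (29.8267, 121.4619)
print(brl)  # ≈ (40.7831, -91.1253)
```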