
How far is Burlington, IA, from Cartagena?

The distance between Cartagena (Rafael Núñez International Airport) and Burlington (Southeast Iowa Regional Airport) is 2296 miles / 3696 kilometers / 1996 nautical miles.

Distance from Cartagena to Burlington

There are several ways to calculate the distance from Cartagena to Burlington. Here are two standard methods:

Vincenty's formula (applied above)
  • 2296.472 miles
  • 3695.814 kilometers
  • 1995.580 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
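
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula. It assumes the WGS-84 ellipsoid (the page does not state which ellipsoid it uses), and the decimal coordinates are converted from the DMS values in the airport table below.

```python
import math

# WGS-84 ellipsoid parameters (assumed; the page does not name its ellipsoid)
A = 6378137.0            # semi-major axis, meters
F = 1 / 298.257223563    # flattening
B = (1 - F) * A          # semi-minor axis, meters

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in statute miles via Vincenty's inverse formula."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # first approximation of the longitude difference on the auxiliary sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = F / 16.0 * cos2_alpha * (4.0 + F * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A * A - B * B) / (B * B)
    big_a = 1.0 + u2 / 16384.0 * (4096.0 + u2 * (-768.0 + u2 * (320.0 - 175.0 * u2)))
    big_b = u2 / 1024.0 * (256.0 + u2 * (-128.0 + u2 * (74.0 - 47.0 * u2)))
    d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)
        - big_b / 6.0 * cos_2sm * (-3.0 + 4.0 * sin_sigma ** 2) * (-3.0 + 4.0 * cos_2sm ** 2)))
    meters = B * big_a * (sigma - d_sigma)
    return meters / 1609.344  # meters -> statute miles

# CTG (10°26′32″N, 75°30′46″W) -> BRL (40°46′59″N, 91°7′31″W) in decimal degrees
print(round(vincenty_miles(10.44222, -75.51278, 40.78306, -91.12528), 3))
# should land very close to the 2296.472 miles quoted above
```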

Haversine formula
  • 2302.583 miles
  • 3705.648 kilometers
  • 2000.890 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
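
A corresponding haversine sketch, assuming the commonly used 6371 km mean Earth radius (the page does not state which radius it uses):

```python
import math

R_KM = 6371.0  # mean Earth radius in km (assumed)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km, treating the Earth as a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R_KM * math.asin(math.sqrt(a))

d = haversine_km(10.44222, -75.51278, 40.78306, -91.12528)
print(f"{d:.1f} km / {d / 1.609344:.1f} mi / {d / 1.852:.1f} NM")
# prints values very close to the haversine figures listed above
```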

How long does it take to fly from Cartagena to Burlington?

The estimated flight time from Rafael Núñez International Airport to Southeast Iowa Regional Airport is 4 hours and 50 minutes.
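
The page does not publish its flight-time model. A common back-of-envelope approach is cruise speed plus a fixed taxi/climb overhead; the 500 mph cruise speed and 30-minute overhead below are assumptions, so the result will not exactly match the 4 hours 50 minutes quoted above.

```python
# Hypothetical rule-of-thumb estimate; cruise_mph and overhead_min are assumptions.
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes:02d} min"

print(estimate_flight_time(2296.472))  # ~5 h 06 min under these assumptions
```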

Flight carbon footprint between Rafael Núñez International Airport (CTG) and Southeast Iowa Regional Airport (BRL)

On average, flying from Cartagena to Burlington generates about 252 kg of CO2 per passenger, which is equivalent to roughly 555 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
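
The emission model behind this figure is likewise unpublished; the sketch below only derives the implied per-kilometer factor from the page's own numbers and converts kilograms to pounds.

```python
# Hypothetical back-calculation from the page's own figures.
CO2_KG = 252.0           # per-passenger figure from the page
DISTANCE_KM = 3695.814   # Vincenty distance from the page
KG_TO_LB = 2.20462

factor = CO2_KG / DISTANCE_KM            # implied ~0.068 kg CO2 per passenger-km
print(f"{factor:.3f} kg CO2/passenger-km")
print(f"{CO2_KG * KG_TO_LB:.0f} lb")     # ~556 lb; the page rounds down to 555
```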

Map of flight path from Cartagena to Burlington

See the map of the shortest flight path between Rafael Núñez International Airport (CTG) and Southeast Iowa Regional Airport (BRL).

Airport information

Origin: Rafael Núñez International Airport
City: Cartagena
Country: Colombia
IATA Code: CTG
ICAO Code: SKCG
Coordinates: 10°26′32″N, 75°30′46″W

Destination: Southeast Iowa Regional Airport
City: Burlington, IA
Country: United States
IATA Code: BRL
ICAO Code: KBRL
Coordinates: 40°46′59″N, 91°7′31″W