
How far is Brainerd, MN, from Cartagena?

The distance between Cartagena (Rafael Núñez International Airport) and Brainerd (Brainerd Lakes Regional Airport) is 2708 miles / 4359 kilometers / 2354 nautical miles.

Rafael Núñez International Airport – Brainerd Lakes Regional Airport

2708 miles / 4359 kilometers / 2354 nautical miles


Distance from Cartagena to Brainerd

There are several ways to calculate the distance from Cartagena to Brainerd. Here are two standard methods:

Vincenty's formula (applied above)
  • 2708.387 miles
  • 4358.726 kilometers
  • 2353.524 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
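The page does not publish its implementation; the sketch below is a standard Python rendering of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates in the usage line are converted from the DMS values in the airport information section, and the convergence tolerance and iteration cap are arbitrary illustrative choices.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two lat/lon points (Vincenty inverse)."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# CTG (10°26′32″N, 75°30′46″W) and BRD (46°23′53″N, 94°8′17″W) in decimal degrees
metres = vincenty_distance(10.4422, -75.5128, 46.3981, -94.1381)
print(f"{metres / 1000:.0f} km / {metres / 1609.344:.0f} mi")  # ≈ 4359 km / 2708 mi
```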

Haversine formula
  • 2714.686 miles
  • 4368.863 kilometers
  • 2358.997 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
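A minimal Python sketch of the haversine computation follows. The mean earth radius of 6371 km is a common convention rather than a value stated on this page, and the decimal coordinates are again converted from the DMS values listed under airport information.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

ctg = (10.4422, -75.5128)   # Rafael Núñez International Airport
brd = (46.3981, -94.1381)   # Brainerd Lakes Regional Airport
km = haversine_distance(*ctg, *brd)
print(f"{km:.0f} km / {km * 0.621371:.0f} mi / {km / 1.852:.0f} NM")
# ≈ 4369 km / 2715 mi / 2359 NM, in line with the haversine figures above
```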

How long does it take to fly from Cartagena to Brainerd?

The estimated flight time from Rafael Núñez International Airport to Brainerd Lakes Regional Airport is 5 hours and 37 minutes.
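The page does not state how this estimate is derived. Purely as an illustration, dividing the 2708-mile distance by an assumed average block speed of about 482 mph (a hypothetical figure chosen for this sketch, not taken from the source) gives a similar result:

```python
def estimated_flight_time(distance_miles, avg_speed_mph=482):
    """Rough block-time estimate; the average speed is an assumed figure."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours and {m} minutes"

print(estimated_flight_time(2708))  # -> "5 hours and 37 minutes"
```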

Flight carbon footprint between Rafael Núñez International Airport (CTG) and Brainerd Lakes Regional Airport (BRD)

On average, flying from Cartagena to Brainerd generates about 300 kg of CO2 per passenger; 300 kilograms is equal to 661 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
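The kilogram-to-pound conversion can be checked directly (using the standard factor of about 2.20462 lbs per kg):

```python
co2_kg = 300                     # estimated CO2 per passenger for this route
co2_lbs = co2_kg * 2.20462       # kilograms to pounds
print(round(co2_lbs))            # -> 661
```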

Map of flight path from Cartagena to Brainerd

See the map of the shortest flight path between Rafael Núñez International Airport (CTG) and Brainerd Lakes Regional Airport (BRD).

Airport information

Origin: Rafael Núñez International Airport
City: Cartagena
Country: Colombia
IATA Code: CTG
ICAO Code: SKCG
Coordinates: 10°26′32″N, 75°30′46″W
Destination: Brainerd Lakes Regional Airport
City: Brainerd, MN
Country: United States
IATA Code: BRD
ICAO Code: KBRD
Coordinates: 46°23′53″N, 94°8′17″W
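For reference, the degree-minute-second coordinates above convert to the decimal degrees used in the distance sketches like this (a small helper written for illustration; the function name is arbitrary):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# CTG: 10°26′32″N, 75°30′46″W
ctg = (dms_to_decimal(10, 26, 32, "N"), dms_to_decimal(75, 30, 46, "W"))
# BRD: 46°23′53″N, 94°8′17″W
brd = (dms_to_decimal(46, 23, 53, "N"), dms_to_decimal(94, 8, 17, "W"))
print(ctg, brd)  # ≈ (10.4422, -75.5128) and (46.3981, -94.1381)
```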