How far is Brandon from Cartagena?
The distance between Cartagena (Rafael Núñez International Airport) and Brandon (Brandon Municipal Airport) is 3058 miles / 4921 kilometers / 2657 nautical miles.
Rafael Núñez International Airport – Brandon Municipal Airport
Distance from Cartagena to Brandon
There are several ways to calculate the distance from Cartagena to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 3057.710 miles
- 4920.908 kilometers
- 2657.078 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
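The site does not publish its implementation, but the standard Vincenty inverse method on the WGS-84 ellipsoid can be sketched as follows; the constants and convergence tolerance here are conventional choices, not the site's confirmed values.

```python
# Vincenty's inverse formula on the WGS-84 ellipsoid -- a minimal sketch.
# Not robust for coincident or near-antipodal points.
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in statute miles between two points (degrees)."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344     # meters -> statute miles

# CTG (10°26'32"N, 75°30'46"W) to YBR (49°54'36"N, 99°57'6"W)
print(vincenty_miles(10.442222, -75.512778, 49.91, -99.951667))
```

Running this with the airport coordinates listed below reproduces the roughly 3058-mile figure quoted above.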
Haversine formula
- 3063.403 miles
- 4930.069 kilometers
- 2662.024 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
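The haversine calculation is compact enough to sketch directly. This version assumes a mean Earth radius of 3958.8 statute miles; the site's exact radius constant is not stated, so results may differ in the last decimal places.

```python
# Great-circle (haversine) distance -- minimal sketch on a spherical Earth.
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles between two points (degrees)."""
    R = 3958.8  # assumed mean Earth radius in statute miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# CTG to YBR, coordinates from the airport tables below
print(haversine_miles(10.442222, -75.512778, 49.91, -99.951667))
```

The spherical result comes out a few miles longer than the ellipsoidal Vincenty figure, consistent with the two numbers quoted above.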
How long does it take to fly from Cartagena to Brandon?
The estimated flight time from Rafael Núñez International Airport to Brandon Municipal Airport is 6 hours and 17 minutes.
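Flight-time estimates like this are typically distance divided by an assumed average ground speed, possibly plus a fixed taxi/climb allowance. The site does not state its assumptions, so the 500 mph figure below is an illustrative assumption and the result will not exactly match the 6 h 17 min quoted above.

```python
# Back-of-the-envelope flight-time estimate -- a sketch only.
distance_miles = 3057.710      # Vincenty distance quoted above
cruise_mph = 500               # assumed average ground speed (not the site's)

hours = distance_miles / cruise_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"~{h} h {m} min at {cruise_mph} mph")
```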
What is the time difference between Cartagena and Brandon?
The time difference between Cartagena and Brandon is 1 hour: Brandon is 1 hour behind Cartagena while it observes Central Standard Time (UTC−6). Colombia stays on UTC−5 year-round, so during daylight saving time, when Brandon shifts to UTC−5, the two cities share the same local time.
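The offset can be checked against the IANA time-zone database. Brandon, Manitoba falls in the `America/Winnipeg` zone, which observes daylight saving time, while `America/Bogota` does not, so the gap depends on the date:

```python
# Compare UTC offsets for Cartagena and Brandon on two sample dates.
from datetime import datetime
from zoneinfo import ZoneInfo

def offset_hours(when, zone):
    """UTC offset in hours for a naive datetime interpreted in `zone`."""
    return when.replace(tzinfo=ZoneInfo(zone)).utcoffset().total_seconds() / 3600

winter = datetime(2024, 1, 15, 12, 0)
summer = datetime(2024, 7, 15, 12, 0)

# Cartagena (America/Bogota) minus Brandon (America/Winnipeg)
print(offset_hours(winter, "America/Bogota") - offset_hours(winter, "America/Winnipeg"))  # 1.0
print(offset_hours(summer, "America/Bogota") - offset_hours(summer, "America/Winnipeg"))  # 0.0
```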
Flight carbon footprint between Rafael Núñez International Airport (CTG) and Brandon Municipal Airport (YBR)
On average, flying from Cartagena to Brandon generates about 341 kg (752 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
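The kilogram-to-pound conversion behind that figure is straightforward (1 kg ≈ 2.20462 lb):

```python
# Convert the per-passenger CO2 estimate from kilograms to pounds.
co2_kg = 341
co2_lb = co2_kg * 2.20462  # 1 kg = 2.20462 lb
print(round(co2_lb))  # 752
```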
Map of flight path from Cartagena to Brandon
See the map of the shortest flight path between Rafael Núñez International Airport (CTG) and Brandon Municipal Airport (YBR).
Airport information
| Origin | Rafael Núñez International Airport |
| --- | --- |
| City: | Cartagena |
| Country: | Colombia |
| IATA Code: | CTG |
| ICAO Code: | SKCG |
| Coordinates: | 10°26′32″N, 75°30′46″W |
| Destination | Brandon Municipal Airport |
| --- | --- |
| City: | Brandon |
| Country: | Canada |
| IATA Code: | YBR |
| ICAO Code: | CYBR |
| Coordinates: | 49°54′36″N, 99°57′6″W |