
How far is Brandon from Pointe-à-Pitre?

The distance between Pointe-à-Pitre (Pointe-à-Pitre International Airport) and Brandon (Brandon Municipal Airport) is 3155 miles / 5078 kilometers / 2742 nautical miles.

Pointe-à-Pitre International Airport – Brandon Municipal Airport

  • 3155 miles
  • 5078 kilometers
  • 2742 nautical miles


Distance from Pointe-à-Pitre to Brandon

There are several ways to calculate the distance from Pointe-à-Pitre to Brandon. Here are two standard methods:

Vincenty's formula (applied above)
  • 3155.161 miles
  • 5077.739 kilometers
  • 2741.760 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
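As a rough illustration (not the calculator's own code), here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid; the ellipsoid constants are standard, while the iteration tolerance, iteration limit, and the decimal coordinates converted from the airport data below are assumptions for this example.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse).
    May fail to converge for nearly antipodal points."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# PTP (16°15′55″N, 61°31′54″W) and YBR (49°54′36″N, 99°57′6″W) in decimal degrees
metres = vincenty_distance(16.2653, -61.5317, 49.9100, -99.9517)
print(round(metres / 1000, 1), "km")   # should land near the 5078 km quoted above
```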

Haversine formula
  • 3156.378 miles
  • 5079.697 kilometers
  • 2742.817 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
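For comparison, a short Python sketch of the haversine (great-circle) calculation, assuming a mean Earth radius of 6371 km and the same decimal coordinates as above; small differences from the quoted figure come from the chosen radius and coordinate rounding.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# PTP (16°15′55″N, 61°31′54″W) to YBR (49°54′36″N, 99°57′6″W)
print(round(haversine_km(16.2653, -61.5317, 49.9100, -99.9517), 1))  # ≈ 5080 km
```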

How long does it take to fly from Pointe-à-Pitre to Brandon?

The estimated flight time from Pointe-à-Pitre International Airport to Brandon Municipal Airport is 6 hours and 28 minutes.
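The page does not state the model behind this estimate, but a quick cross-check of the quoted figures gives the implied average block speed:

```python
# Arithmetic cross-check only; the calculator's actual flight-time model is not stated.
distance_miles = 3155
duration_hours = 6 + 28 / 60                     # 6 h 28 min
print(round(distance_miles / duration_hours))    # ≈ 488 mph implied average speed
```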

Flight carbon footprint between Pointe-à-Pitre International Airport (PTP) and Brandon Municipal Airport (YBR)

On average, flying from Pointe-à-Pitre to Brandon generates about 353 kg of CO2 per passenger, which is equivalent to roughly 778 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
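The following snippet cross-checks these quoted figures; it is not the calculator's methodology, just the unit conversion and the emission factor implied by the numbers above.

```python
# Cross-checks of the quoted carbon figures (assumptions, not the site's actual model).
co2_kg = 353
distance_km = 5078
print(round(co2_kg * 2.20462))           # ≈ 778 lb, matching the text
print(round(co2_kg / distance_km, 3))    # ≈ 0.07 kg CO2 per passenger-km implied
```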

Map of flight path from Pointe-à-Pitre to Brandon

See the map of the shortest flight path between Pointe-à-Pitre International Airport (PTP) and Brandon Municipal Airport (YBR).

Airport information

Origin: Pointe-à-Pitre International Airport
City: Pointe-à-Pitre
Country: Guadeloupe
IATA Code: PTP
ICAO Code: TFFR
Coordinates: 16°15′55″N, 61°31′54″W
Destination: Brandon Municipal Airport
City: Brandon
Country: Canada
IATA Code: YBR
ICAO Code: CYBR
Coordinates: 49°54′36″N, 99°57′6″W