How far is Brandon from Punta Cana?
The distance between Punta Cana (Punta Cana International Airport) and Brandon (Brandon Municipal Airport) is 2777 miles / 4469 kilometers / 2413 nautical miles.
Punta Cana International Airport – Brandon Municipal Airport
Distance from Punta Cana to Brandon
There are several ways to calculate the distance from Punta Cana to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 2777.074 miles
- 4469.267 kilometers
- 2413.211 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
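For reference, here is a sketch of the inverse Vincenty method in Python, using the standard WGS-84 ellipsoid parameters and the airport coordinates listed in the tables below. The coordinates are rounded to the nearest arc-second, so the result may differ from the figure above by a fraction of a kilometre.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: geodesic distance on the WGS-84 ellipsoid, in km."""
    a = 6378137.0              # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (metres)

    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# PUJ (18°34′2″N, 68°21′48″W) to YBR (49°54′36″N, 99°57′6″W)
dist_vincenty = vincenty_km(18.5672, -68.3633, 49.9100, -99.9517)
print(f"{dist_vincenty:.1f} km")
```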
Haversine formula
- 2778.633 miles
- 4471.777 kilometers
- 2414.566 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
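As a concrete illustration, here is a minimal haversine implementation in Python, using the airport coordinates from the tables below. A mean earth radius of 6371 km is assumed; the site's exact radius may differ slightly, so the result lands within a few kilometres of the figure above.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# PUJ (18°34′2″N, 68°21′48″W) to YBR (49°54′36″N, 99°57′6″W)
dist_haversine = haversine_km(18.5672, -68.3633, 49.9100, -99.9517)
print(f"{dist_haversine:.1f} km")  # roughly 4471-4472 km with these rounded inputs
```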
How long does it take to fly from Punta Cana to Brandon?
The estimated flight time from Punta Cana International Airport to Brandon Municipal Airport is 5 hours and 45 minutes.
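The page does not state how this estimate is derived; figures like this typically divide the great-circle distance by an assumed average block speed. Back-solving from the numbers above, the 5 hour 45 minute estimate corresponds to an average speed of about 483 mph:

```python
distance_miles = 2777             # Vincenty distance from the section above
flight_time_hours = 5 + 45 / 60   # quoted estimate: 5 hours 45 minutes

implied_mph = distance_miles / flight_time_hours
print(f"implied average speed: {implied_mph:.0f} mph")  # about 483 mph
```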
What is the time difference between Punta Cana and Brandon?
The time difference between Punta Cana and Brandon is 2 hours: Brandon is 2 hours behind Punta Cana.
Flight carbon footprint between Punta Cana International Airport (PUJ) and Brandon Municipal Airport (YBR)
On average, flying from Punta Cana to Brandon generates about 308 kg (678 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
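As a quick sanity check on the unit conversion (this just applies the exact kilograms-to-pounds factor; the page's 678 lb was presumably converted before the kilogram figure was rounded to 308):

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 308
co2_lb = co2_kg * KG_TO_LB
print(f"{co2_kg} kg = {co2_lb:.0f} lb")  # the rounded 308 kg converts to 679 lb
```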
Map of flight path from Punta Cana to Brandon
See the map of the shortest flight path between Punta Cana International Airport (PUJ) and Brandon Municipal Airport (YBR).
Airport information
| Origin | Punta Cana International Airport |
| --- | --- |
| City | Punta Cana |
| Country | Dominican Republic |
| IATA Code | PUJ |
| ICAO Code | MDPC |
| Coordinates | 18°34′2″N, 68°21′48″W |
| Destination | Brandon Municipal Airport |
| --- | --- |
| City | Brandon |
| Country | Canada |
| IATA Code | YBR |
| ICAO Code | CYBR |
| Coordinates | 49°54′36″N, 99°57′6″W |