How far is Brandon from Puebla?
The distance between Puebla (Puebla International Airport) and Brandon (Brandon Municipal Airport) is 2122 miles / 3415 kilometers / 1844 nautical miles.
The driving distance from Puebla (PBC) to Brandon (YBR) is 2513 miles / 4045 kilometers, and travel time by car is about 47 hours 2 minutes.
Puebla International Airport – Brandon Municipal Airport
Distance from Puebla to Brandon
There are several ways to calculate the distance from Puebla to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 2121.715 miles
- 3414.569 kilometers
- 1843.720 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
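As a sketch of how such an ellipsoidal distance can be computed, here is a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the airport coordinates listed in the tables below. The function name and convergence tolerance are our own choices; the article does not publish its code.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance (statute miles) via Vincenty's inverse formula."""
    # WGS-84 ellipsoid parameters
    a = 6378137.0            # semi-major axis (metres)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(
            math.cos(U2) * sin_lam,
            math.cos(U1) * math.sin(U2) - math.sin(U1) * math.cos(U2) * cos_lam)
        cos_sigma = math.sin(U1) * math.sin(U2) + math.cos(U1) * math.cos(U2) * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = math.cos(U1) * math.cos(U2) * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * math.sin(U1) * math.sin(U2) / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    s = b * A * (sigma - delta_sigma)  # geodesic length in metres
    return s / 1609.344

# PBC and YBR coordinates (decimal degrees) from the airport tables below
d_mi = vincenty_miles(19.158056, -98.371389, 49.91, -99.951667)
print(f"{d_mi:.1f} miles")
```

With these coordinates the result lands within a mile or so of the 2121.715-mile figure quoted above; small differences come from the rounded coordinates.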
Haversine formula
- 2126.545 miles
- 3422.342 kilometers
- 1847.917 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points on the surface of a sphere).
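The haversine calculation is short enough to sketch directly. Below is a minimal implementation using a mean earth radius of 6371 km (a common convention; the article does not state which radius it uses), applied to the airport coordinates from the tables below.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance (km) on a sphere of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# PBC and YBR coordinates (decimal degrees) from the airport tables below
d = haversine_km(19.158056, -98.371389, 49.91, -99.951667)
print(f"{d:.1f} km / {d / 1.609344:.1f} mi")
```

The output agrees with the ~3422 km / ~2126 mi haversine figures above to within a few kilometres, the residual coming from rounded coordinates and the choice of earth radius.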
How long does it take to fly from Puebla to Brandon?
The estimated flight time from Puebla International Airport to Brandon Municipal Airport is 4 hours and 31 minutes.
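The article does not say how it derives its flight-time estimate. A common rule of thumb is cruise time at a typical jet speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses assumed values (500 mph cruise, 30 minutes overhead) and therefore lands near, but not exactly on, the 4 h 31 min quoted above.

```python
def flight_time_hours(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus fixed overhead.
    cruise_mph and overhead_min are assumed values, not the article's model."""
    return distance_miles / cruise_mph + overhead_min / 60

t = flight_time_hours(2122)  # great-circle distance from above
hours = int(t)
minutes = round((t - hours) * 60)
print(f"about {hours} h {minutes} min")
```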
What is the time difference between Puebla and Brandon?
Both cities are on Central Time, so for much of the year there is no time difference. Puebla stays on UTC−6 year-round (Mexico abolished daylight saving time in 2022), while Brandon observes DST, so during the summer months Brandon (UTC−5) is one hour ahead of Puebla.
Flight carbon footprint between Puebla International Airport (PBC) and Brandon Municipal Airport (YBR)
On average, flying from Puebla to Brandon generates about 231 kg (roughly 510 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
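The kilogram-to-pound conversion can be checked directly. Note that the rounded 231 kg converts to 509 lb, so the article's 510 lb presumably comes from an unrounded per-passenger figure.

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 231
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # → 509
```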
Map of flight path and driving directions from Puebla to Brandon
See the map of the shortest flight path between Puebla International Airport (PBC) and Brandon Municipal Airport (YBR).
Airport information
| Origin | Puebla International Airport |
| --- | --- |
| City | Puebla |
| Country | Mexico |
| IATA Code | PBC |
| ICAO Code | MMPB |
| Coordinates | 19°9′29″N, 98°22′17″W |
| Destination | Brandon Municipal Airport |
| --- | --- |
| City | Brandon |
| Country | Canada |
| IATA Code | YBR |
| ICAO Code | CYBR |
| Coordinates | 49°54′36″N, 99°57′6″W |