How far is Brandon from Oaxaca?
The distance between Oaxaca (Oaxaca International Airport) and Brandon (Brandon Municipal Airport) is 2275 miles / 3662 kilometers / 1977 nautical miles.
The driving distance from Oaxaca (OAX) to Brandon (YBR) is 2706 miles / 4355 kilometers, and travel time by car is about 51 hours 3 minutes.
Oaxaca International Airport – Brandon Municipal Airport
Distance from Oaxaca to Brandon
There are several ways to calculate the distance from Oaxaca to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 2275.486 miles
- 3662.039 kilometers
- 1977.343 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
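As a quick illustration, an ellipsoidal distance like this can be reproduced with the geopy library, which computes geodesics on the WGS-84 ellipsoid (geopy uses Karney's algorithm rather than Vincenty's iteration, but the results agree to well under a metre). The decimal coordinates below are converted from the airport coordinates in the tables further down; the library choice and conversion are illustrative assumptions, not part of the original calculation.

```python
# Sketch: ellipsoidal (WGS-84) distance between OAX and YBR using geopy.
# Coordinates are taken from the airport tables below and converted to
# decimal degrees (an approximation to the nearest arcsecond).
from geopy.distance import geodesic

oax = (16 + 59/60 + 59/3600, -(96 + 43/60 + 35/3600))  # 16°59′59″N, 96°43′35″W
ybr = (49 + 54/60 + 36/3600, -(99 + 57/60 + 6/3600))   # 49°54′36″N, 99°57′6″W

d = geodesic(oax, ybr)  # WGS-84 ellipsoid by default
print(f"{d.miles:.1f} mi, {d.km:.1f} km, {d.nautical:.1f} nm")
# Expect roughly 2275 mi / 3662 km / 1977 nm, matching the figures above.
```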
Haversine formula
- 2280.969 miles
- 3670.864 kilometers
- 1982.108 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
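For comparison, here is a minimal haversine sketch in plain Python (standard library only). It assumes a mean Earth radius of 6371 km; with the airport coordinates from the tables below it lands within about a kilometre of the 3670.864 km figure above.

```python
# Minimal haversine (great-circle) distance sketch, assuming a spherical
# Earth with mean radius 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometres."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# OAX (16°59′59″N, 96°43′35″W) to YBR (49°54′36″N, 99°57′6″W) in decimal degrees
km = haversine_km(16.9997, -96.7264, 49.9100, -99.9517)
print(f"{km:.1f} km ({km * 0.621371:.1f} mi, {km / 1.852:.1f} nm)")
# Prints roughly 3671 km (2281 mi, 1982 nm), matching the haversine figures above.
```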
How long does it take to fly from Oaxaca to Brandon?
The estimated flight time from Oaxaca International Airport to Brandon Municipal Airport is 4 hours and 48 minutes.
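The assumptions behind the 4 hours 48 minutes estimate aren't stated on this page. A common back-of-the-envelope approach divides the straight-line distance by a typical cruise ground speed and adds a fixed allowance for taxi, climb, and descent; the speed and allowance in the sketch below are illustrative assumptions, not the parameters used for the figure above.

```python
# Back-of-the-envelope flight-time sketch: distance / cruise speed plus a
# fixed taxi/climb/descent allowance. The 500 mph and 30 min values are
# illustrative assumptions, not the parameters behind the 4 h 48 min figure.
def estimate_flight_time(distance_mi, cruise_mph=500.0, overhead_min=30.0):
    minutes = distance_mi / cruise_mph * 60 + overhead_min
    return divmod(round(minutes), 60)

hours, mins = estimate_flight_time(2275.486)
print(f"about {hours} h {mins} min")  # about 5 h 3 min with these assumptions
```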
What is the time difference between Oaxaca and Brandon?
Flight carbon footprint between Oaxaca International Airport (OAX) and Brandon Municipal Airport (YBR)
On average, flying from Oaxaca to Brandon generates about 249 kg (549 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
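As a rough check on these numbers, the per-passenger figure works out to about 0.068 kg of CO2 per passenger-kilometre (249 kg over 3662 km), and the pound figure is simply kilograms × 2.20462. The sketch below only redoes that arithmetic; the emission factor is back-solved from the page's own figures, not an independent estimate.

```python
# Re-derive the carbon figures above: the implied per-passenger-km emission
# factor and the kg -> lb conversion. Nothing here is an independent estimate.
distance_km = 3662.0
co2_kg_per_passenger = 249.0

factor = co2_kg_per_passenger / distance_km  # ~0.068 kg CO2 per passenger-km
co2_lb = co2_kg_per_passenger * 2.20462      # ~549 lb

print(f"implied factor: {factor:.3f} kg/passenger-km, {co2_lb:.0f} lb CO2")
```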
Map of flight path and driving directions from Oaxaca to Brandon
See the map of the shortest flight path between Oaxaca International Airport (OAX) and Brandon Municipal Airport (YBR).
Airport information
| Origin | Oaxaca International Airport |
|---|---|
| City: | Oaxaca |
| Country: | Mexico |
| IATA Code: | OAX |
| ICAO Code: | MMOX |
| Coordinates: | 16°59′59″N, 96°43′35″W |
| Destination | Brandon Municipal Airport |
|---|---|
| City: | Brandon |
| Country: | Canada |
| IATA Code: | YBR |
| ICAO Code: | CYBR |
| Coordinates: | 49°54′36″N, 99°57′6″W |