How far is Brandon from Longview, TX?
The distance between Longview (East Texas Regional Airport) and Brandon (Brandon Municipal Airport) is 1239 miles / 1994 kilometers / 1077 nautical miles.
The driving distance from Longview (GGG) to Brandon (YBR) is 1442 miles / 2321 kilometers, and travel time by car is about 27 hours 3 minutes.
East Texas Regional Airport – Brandon Municipal Airport
Distance from Longview to Brandon
There are several ways to calculate the distance from Longview to Brandon. Here are two standard methods:
Vincenty's formula (applied above)
- 1239.157 miles
- 1994.230 kilometers
- 1076.798 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
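As a sketch, the iterative Vincenty inverse solution on the WGS-84 ellipsoid can be written in Python (the function name, iteration cap, and convergence tolerance below are illustrative choices, not taken from this page):

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal (WGS-84) distance in metres via Vincenty's inverse formula."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# GGG (32°23′2″N, 94°42′41″W) to YBR (49°54′36″N, 99°57′6″W), in decimal degrees
d_m = vincenty_distance(32.383889, -94.711389, 49.91, -99.951667)
```

With these coordinates the result is close to the 1994.230 km quoted above; small differences can come from rounding the coordinates to whole seconds.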
Haversine formula
- 1240.465 miles
- 1996.335 kilometers
- 1077.934 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
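The haversine formula is short enough to show directly; this minimal Python sketch assumes a mean Earth radius of 6371 km (the function name is illustrative):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# GGG (32°23′2″N, 94°42′41″W) to YBR (49°54′36″N, 99°57′6″W), in decimal degrees
d_km = haversine_km(32.383889, -94.711389, 49.91, -99.951667)
```

This reproduces the roughly 1996 km figure above; the ~2 km gap versus Vincenty reflects the spherical versus ellipsoidal Earth models.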
How long does it take to fly from Longview to Brandon?
The estimated flight time from East Texas Regional Airport to Brandon Municipal Airport is 2 hours and 50 minutes.
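The page does not state how the flight time is estimated. A common rule of thumb divides the straight-line distance by a typical jet cruise speed and adds a fixed taxi/climb buffer; both the 500 mph speed and the 30-minute buffer below are assumptions, so the result will not exactly match the 2 hours 50 minutes quoted above:

```python
def estimated_flight_time_h(distance_mi, cruise_mph=500, buffer_min=30):
    """Rough flight-time estimate: cruise time plus a fixed buffer (hours)."""
    return distance_mi / cruise_mph + buffer_min / 60

t = estimated_flight_time_h(1239.157)     # Vincenty distance in miles
hours, minutes = int(t), round((t - int(t)) * 60)
```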
What is the time difference between Longview and Brandon?
There is no time difference between Longview and Brandon: both cities observe Central Time.
Flight carbon footprint between East Texas Regional Airport (GGG) and Brandon Municipal Airport (YBR)
On average, flying from Longview to Brandon generates about 163 kg of CO2 per passenger (163 kilograms equals 359 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
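The kilogram-to-pound conversion and the implied per-mile emission rate can be checked directly (the per-mile figure is simply the quoted total divided by the Vincenty distance, not a factor stated by the page):

```python
co2_kg = 163.0
co2_lb = co2_kg * 2.20462          # kilograms to pounds
kg_per_mile = co2_kg / 1239.157    # implied kg CO2 per passenger-mile
```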
Map of flight path and driving directions from Longview to Brandon
See the map of the shortest flight path between East Texas Regional Airport (GGG) and Brandon Municipal Airport (YBR).
Airport information
| Origin | East Texas Regional Airport |
|---|---|
| City: | Longview, TX |
| Country: | United States |
| IATA Code: | GGG |
| ICAO Code: | KGGG |
| Coordinates: | 32°23′2″N, 94°42′41″W |
| Destination | Brandon Municipal Airport |
|---|---|
| City: | Brandon |
| Country: | Canada |
| IATA Code: | YBR |
| ICAO Code: | CYBR |
| Coordinates: | 49°54′36″N, 99°57′6″W |
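The airport coordinates above are given in degrees, minutes, and seconds; distance formulas want decimal degrees. A small helper makes the conversion explicit (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# East Texas Regional Airport: 32°23′2″N, 94°42′41″W
ggg_lat = dms_to_decimal(32, 23, 2, "N")
ggg_lon = dms_to_decimal(94, 42, 41, "W")
```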