How far is Texada from Summer Beaver?
The distance between Summer Beaver (Summer Beaver Airport) and Texada (Texada/Gillies Bay Airport) is 1560 miles / 2510 kilometers / 1355 nautical miles.
The driving distance from Summer Beaver (SUR) to Texada (YGB) is 2039 miles / 3281 kilometers, and travel time by car is about 46 hours 42 minutes.
Summer Beaver Airport – Texada/Gillies Bay Airport
Distance from Summer Beaver to Texada
There are several ways to calculate the distance from Summer Beaver to Texada. Here are two standard methods:
Vincenty's formula (applied above)
- 1559.611 miles
- 2509.950 kilometers
- 1355.264 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
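For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, iteration cap, and convergence tolerance are my own choices, and the sketch deliberately skips the special cases (coincident and nearly antipodal points) where the iteration can fail:

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles.

    Minimal sketch: no handling for coincident or nearly antipodal points.
    """
    a, f = 6378137.0, 1 / 298.257223563      # WGS-84 semi-major axis (m) and flattening
    b = (1 - f) * a                          # semi-minor axis
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    lam = L
    for _ in range(200):                     # iterate until lambda converges
        sin_sigma = sqrt((cos(U2) * sin(lam)) ** 2 +
                         (cos(U1) * sin(U2) - sin(U1) * cos(U2) * cos(lam)) ** 2)
        cos_sigma = sin(U1) * sin(U2) + cos(U1) * cos(U2) * cos(lam)
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cos(U1) * cos(U2) * sin(lam) / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sin(U1) * sin(U2) / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
            C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # metres to statute miles

# SUR and YGB in decimal degrees (see the conversion helper after the airport tables below)
print(vincenty_miles(52.7083, -88.5417, 49.6942, -124.5178))  # ≈ 1559.6 miles
```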
Haversine formula
- 1554.773 miles
- 2502.164 kilometers
- 1351.061 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
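The same calculation with the haversine formula is much shorter. A minimal sketch, assuming the conventional mean Earth radius of 3,958.8 miles (6,371 km):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, lam1, phi2, lam2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((phi2 - phi1) / 2) ** 2 +
         cos(phi1) * cos(phi2) * sin((lam2 - lam1) / 2) ** 2)
    return 2 * radius_mi * asin(sqrt(a))

# Same decimal-degree coordinates as in the Vincenty sketch above
print(haversine_miles(52.7083, -88.5417, 49.6942, -124.5178))  # ≈ 1554.8 miles
```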
How long does it take to fly from Summer Beaver to Texada?
The estimated flight time from Summer Beaver Airport to Texada/Gillies Bay Airport is 3 hours and 27 minutes.
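The article does not state how this estimate is derived. A common rule of thumb models the trip as cruise time plus a fixed allowance for taxi, climb, and descent; the sketch below uses assumed values of 500 mph and 30 minutes, which lands close to (but not exactly on) the figure above:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time model: cruise time plus a fixed allowance for
    taxi, takeoff, climb, and descent. Both parameters are assumptions;
    the article does not state the model behind its 3 h 27 min figure."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(1559.611))  # "3 hours 37 minutes" under these assumptions
```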
What is the time difference between Summer Beaver and Texada?
Summer Beaver is in Ontario (Eastern Time Zone) and Texada is in British Columbia (Pacific Time Zone), so Texada is 3 hours behind Summer Beaver.
Flight carbon footprint between Summer Beaver Airport (SUR) and Texada/Gillies Bay Airport (YGB)
On average, flying from Summer Beaver to Texada generates about 183 kg of CO2 per passenger, which is equivalent to about 404 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
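The site does not publish its emission model, but a flat per-passenger-mile factor reproduces the figure closely. In the sketch below, the 0.1173 kg-per-mile factor is an assumption back-calculated from the 183 kg figure, not a published constant:

```python
KG_CO2_PER_PASSENGER_MILE = 0.1173  # assumed factor, back-calculated from the 183 kg figure
LBS_PER_KG = 2.20462                # kilograms-to-pounds conversion

def flight_co2_kg(distance_miles, factor=KG_CO2_PER_PASSENGER_MILE):
    """Per-passenger CO2 estimate from a flat per-mile emission factor.

    Real footprints vary with aircraft type, load factor, and routing;
    this is only a back-of-the-envelope version of the estimate above."""
    return distance_miles * factor

kg = flight_co2_kg(1559.611)
print(f"{kg:.0f} kg CO2 ≈ {kg * LBS_PER_KG:.0f} lbs")  # 183 kg CO2 ≈ 403 lbs
```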
Map of flight path and driving directions from Summer Beaver to Texada
See the map of the shortest flight path between Summer Beaver Airport (SUR) and Texada/Gillies Bay Airport (YGB).
Airport information
| Origin | Summer Beaver Airport |
| --- | --- |
| City | Summer Beaver |
| Country | Canada |
| IATA Code | SUR |
| ICAO Code | CJV7 |
| Coordinates | 52°42′30″N, 88°32′30″W |

| Destination | Texada/Gillies Bay Airport |
| --- | --- |
| City | Texada |
| Country | Canada |
| IATA Code | YGB |
| ICAO Code | CYGB |
| Coordinates | 49°41′39″N, 124°31′4″W |
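The distance snippets above take decimal degrees, while these tables list coordinates in degrees/minutes/seconds. A small conversion helper (the function name is mine) bridges the two:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees (south and west are negative)."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SUR: 52°42′30″N, 88°32′30″W  →  52.7083, -88.5417
print(dms_to_decimal(52, 42, 30, "N"), dms_to_decimal(88, 32, 30, "W"))
# YGB: 49°41′39″N, 124°31′4″W  →  49.6942, -124.5178
print(dms_to_decimal(49, 41, 39, "N"), dms_to_decimal(124, 31, 4, "W"))
```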