How far is Texada from San Julian?
The distance between San Julian (Capitán José Daniel Vazquez Airport) and Texada (Texada/Gillies Bay Airport) is 7602 miles / 12234 kilometers / 6606 nautical miles.
Distance from San Julian to Texada
There are several ways to calculate the distance from San Julian to Texada. Here are two standard methods:
Vincenty's formula (applied above)
- 7601.735 miles
- 12233.806 kilometers
- 6605.727 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
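For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates come from the airport tables further down; the iteration cap and convergence tolerance are illustrative choices, not something stated on this page.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: geodesic distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0                      # semi-major axis (m)
    f = 1 / 298.257223563              # flattening
    b = (1 - f) * a                    # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):               # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0                 # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        # cos(2 * sigma_m); zero when the geodesic runs along the equator
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM
                                    + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# ULA (49°18′24″S, 67°48′9″W) to YGB (49°41′39″N, 124°31′4″W)
metres = vincenty_distance(-49.306667, -67.8025, 49.694167, -124.517778)
print(f"{metres / 1609.344:.3f} mi")   # ≈ 7601.7 mi, the figure quoted above
```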
Haversine formula
- 7620.228 miles
- 12263.568 kilometers
- 6621.797 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
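For comparison, here is a short haversine sketch in the same style. The 6371 km mean earth radius is a common convention for these calculators and is assumed here; the page itself does not state the radius it uses.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_distance(-49.306667, -67.8025, 49.694167, -124.517778)
print(f"{km:.1f} km")  # ≈ 12263.6 km, the haversine figure above
```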
How long does it take to fly from San Julian to Texada?
The estimated flight time from Capitán José Daniel Vazquez Airport to Texada/Gillies Bay Airport is 14 hours and 53 minutes.
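The page does not state the speed assumption behind this estimate, but backing it out from the Vincenty distance gives an average block speed of roughly 511 mph. The helper below is hypothetical and simply inverts that arithmetic.

```python
# Hypothetical helper: 7601.735 mi in 14 h 53 min implies an average
# block speed near 511 mph; the site's exact method is not stated.
def flight_time(distance_miles, avg_speed_mph=511):
    hours = distance_miles / avg_speed_mph
    return int(hours), round((hours % 1) * 60)

h, m = flight_time(7601.735)
print(f"{h} h {m} min")  # 14 h 53 min
```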
What is the time difference between San Julian and Texada?
The time difference between San Julian and Texada is 5 hours: Texada is 5 hours behind San Julian.
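A quick way to check the offset, assuming the IANA zones America/Argentina/Rio_Gallegos for San Julian (Santa Cruz province) and America/Vancouver for Texada. Note the gap shrinks to 4 hours while Pacific Daylight Time is in effect, since Argentina does not observe DST.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Assumed IANA zones: Santa Cruz province for San Julian, BC for Texada.
san_julian = ZoneInfo("America/Argentina/Rio_Gallegos")  # UTC-3 year-round
texada = ZoneInfo("America/Vancouver")                   # UTC-8 (PST) / UTC-7 (PDT)

t = datetime(2024, 1, 15, 12, 0)  # a date when PST applies
diff = (t.replace(tzinfo=san_julian).utcoffset()
        - t.replace(tzinfo=texada).utcoffset())
print(diff)  # 5:00:00 — Texada runs 5 hours behind San Julian
```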
Flight carbon footprint between Capitán José Daniel Vazquez Airport (ULA) and Texada/Gillies Bay Airport (YGB)
On average, flying from San Julian to Texada generates about 941 kg of CO2 per passenger; 941 kilograms equals 2,074 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
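The unit conversion is easy to verify:

```python
KG_TO_LB = 2.2046226218   # pounds per kilogram
co2_kg = 941              # per-passenger estimate quoted above
print(f"{co2_kg * KG_TO_LB:,.1f} lb")  # 2,074.5 lb
```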
Map of flight path from San Julian to Texada
See the map of the shortest flight path between Capitán José Daniel Vazquez Airport (ULA) and Texada/Gillies Bay Airport (YGB).
Airport information
| Origin | Capitán José Daniel Vazquez Airport |
| --- | --- |
| City | San Julian |
| Country | Argentina |
| IATA Code | ULA |
| ICAO Code | SAWJ |
| Coordinates | 49°18′24″S, 67°48′9″W |
| Destination | Texada/Gillies Bay Airport |
| --- | --- |
| City | Texada |
| Country | Canada |
| IATA Code | YGB |
| ICAO Code | CYGB |
| Coordinates | 49°41′39″N, 124°31′4″W |
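The degree-minute-second coordinates above convert to the decimal degrees used by the distance formulas. The small parser below is a hypothetical helper; its regex assumes the exact °, ′, ″ notation used in these tables.

```python
import re

def dms_to_decimal(dms):
    """Parse e.g. '49°18′24″S' into signed decimal degrees (S/W negative)."""
    deg, minutes, seconds, hemi = re.fullmatch(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("49°18′24″S"))  # ≈ -49.3067 (ULA latitude)
print(dms_to_decimal("124°31′4″W"))  # ≈ -124.5178 (YGB longitude)
```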