
How far is Texada from San Martín de los Andes?

The distance between San Martín de los Andes (Aviador Carlos Campos Airport) and Texada (Texada/Gillies Bay Airport) is 6980 miles / 11234 kilometers / 6066 nautical miles.

Aviador Carlos Campos Airport – Texada/Gillies Bay Airport

6980 miles / 11234 kilometers / 6066 nautical miles


Distance from San Martín de los Andes to Texada

There are several ways to calculate the distance from San Martín de los Andes to Texada. Here are two standard methods:

Vincenty's formula (applied above)
  • 6980.233 miles
  • 11233.596 kilometers
  • 6065.657 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
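As an illustration, here is a minimal pure-Python sketch of the inverse Vincenty computation on the WGS-84 ellipsoid. It is not necessarily the exact implementation this calculator uses; the coordinates are taken from the airport information below.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (meters)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# CPC (40°4′31″S, 71°8′14″W) to YGB (49°41′39″N, 124°31′4″W)
print(vincenty_km(-40.075278, -71.137222, 49.694167, -124.517778))
```

Run on the airport coordinates, this agrees with the roughly 11234 km reported above to within a few kilometers of rounding.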

Haversine formula
  • 6998.194 miles
  • 11262.502 kilometers
  • 6081.264 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
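The haversine calculation is much simpler than Vincenty's. A self-contained sketch, assuming the commonly used mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km by default)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

# CPC (40°4′31″S, 71°8′14″W) to YGB (49°41′39″N, 124°31′4″W)
print(haversine_km(-40.075278, -71.137222, 49.694167, -124.517778))
```

With the airport coordinates this lands close to the roughly 11262 km quoted above; the ellipsoidal Vincenty figure differs by about 29 km (roughly 0.26%).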

How long does it take to fly from San Martín de los Andes to Texada?

The estimated flight time from Aviador Carlos Campos Airport to Texada/Gillies Bay Airport is 13 hours and 42 minutes.
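The page does not state how this estimate is derived; such figures typically divide the distance by an assumed average block speed. As a rough sketch (the 509 mph speed is an assumption back-solved to match the 13 h 42 min above, not the site's documented method):

```python
def flight_time(distance_miles, avg_speed_mph=509):
    """Rough block-time estimate: distance divided by an assumed average speed.
    509 mph is an assumption chosen for illustration, not a documented value."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = int((hours - h) * 60)   # truncate the fractional hour to whole minutes
    return h, m

print(flight_time(6980.233))    # result depends entirely on the assumed speed
```

A different assumed speed shifts the answer by tens of minutes, so treat any such figure as indicative only.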

Flight carbon footprint between Aviador Carlos Campos Airport (CPC) and Texada/Gillies Bay Airport (YGB)

On average, flying from San Martín de los Andes to Texada generates about 852 kg (1,878 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
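The pound conversion follows directly from the exact definition 1 lb = 0.45359237 kg:

```python
KG_PER_LB = 0.45359237          # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

co2_kg = 852                    # per-passenger estimate quoted above
print(round(kg_to_lb(co2_kg)))  # 1878
```

This also implies an intensity of roughly 0.12 kg of CO2 per passenger-mile for this route (852 kg over 6980 miles).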

Map of flight path from San Martín de los Andes to Texada

See the map of the shortest flight path between Aviador Carlos Campos Airport (CPC) and Texada/Gillies Bay Airport (YGB).

Airport information

Origin: Aviador Carlos Campos Airport
City: San Martín de los Andes
Country: Argentina
IATA Code: CPC
ICAO Code: SAZY
Coordinates: 40°4′31″S, 71°8′14″W
Destination: Texada/Gillies Bay Airport
City: Texada
Country: Canada
IATA Code: YGB
ICAO Code: CYGB
Coordinates: 49°41′39″N, 124°31′4″W
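The coordinates above are given in degrees, minutes, and seconds; the distance formulas need signed decimal degrees. A small conversion helper:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees (south and west are negative)."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# CPC: 40°4′31″S, 71°8′14″W
print(dms_to_decimal(40, 4, 31, "S"), dms_to_decimal(71, 8, 14, "W"))
# YGB: 49°41′39″N, 124°31′4″W
print(dms_to_decimal(49, 41, 39, "N"), dms_to_decimal(124, 31, 4, "W"))
```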