
How far is Texada from Pointe-à-Pitre?

The distance between Pointe-à-Pitre (Pointe-à-Pitre International Airport) and Texada (Texada/Gillies Bay Airport) is 4168 miles / 6708 kilometers / 3622 nautical miles.


Distance from Pointe-à-Pitre to Texada

There are several ways to calculate the distance from Pointe-à-Pitre to Texada. Here are two standard methods:

Vincenty's formula (applied above)
  • 4168.293 miles
  • 6708.217 kilometers
  • 3622.148 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
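As a rough illustration of the method, here is a minimal Python sketch of the Vincenty inverse formula on the WGS-84 ellipsoid. The coordinates are the decimal-degree equivalents of the airport coordinates listed below; the iteration tolerance and cap are arbitrary choices, not part of the formula itself.

```python
import math

def vincenty_meters(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha == 0 only for equatorial lines; avoid division by zero
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# PTP and YGB in decimal degrees (converted from the airport table below)
meters = vincenty_meters(16.26528, -61.53167, 49.69417, -124.51778)
```

With these inputs the result comes out close to the 6708 km quoted above.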

Haversine formula
  • 4165.486 miles
  • 6703.700 kilometers
  • 3619.709 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
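The haversine formula is much simpler, since it treats the Earth as a sphere. A minimal sketch, using the conventional mean Earth radius of 6371 km and the same decimal coordinates as above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(16.26528, -61.53167, 49.69417, -124.51778)
```

The spherical result lands a few kilometers below the ellipsoidal Vincenty figure, which is the expected size of the discrepancy between the two methods.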

How long does it take to fly from Pointe-à-Pitre to Texada?

The estimated flight time from Pointe-à-Pitre International Airport to Texada/Gillies Bay Airport is 8 hours and 23 minutes.
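The assumptions behind this estimate are not stated. A common rule of thumb is cruise time at an assumed average speed plus a fixed allowance for taxi, climb, and descent; both numbers below (500 mph and 30 minutes) are illustrative assumptions, and with them the sketch gives roughly 8 h 50 min rather than the exact figure quoted above.

```python
def flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Estimate flight time: cruise at an assumed speed plus fixed overhead."""
    minutes = distance_miles / cruise_mph * 60 + overhead_min
    hours, mins = divmod(round(minutes), 60)
    return hours, mins

hours, mins = flight_time(4168)  # → (8, 50) under these assumptions
```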

Flight carbon footprint between Pointe-à-Pitre International Airport (PTP) and Texada/Gillies Bay Airport (YGB)

On average, flying from Pointe-à-Pitre to Texada generates about 477 kg of CO2 per passenger, which is roughly 1,053 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
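Back-solving from the figures above, 477 kg over 4168 miles works out to roughly 0.114 kg of CO2 per passenger-mile. The sketch below uses that inferred factor; the calculator's actual emissions model is not disclosed, so treat the factor as an assumption.

```python
LBS_PER_KG = 2.20462           # pounds per kilogram

def co2_estimate(distance_miles, kg_per_mile=0.1144):
    """Per-passenger CO2 estimate using an assumed per-mile emission factor."""
    kg = distance_miles * kg_per_mile
    return kg, kg * LBS_PER_KG

kg, lbs = co2_estimate(4168)   # roughly 477 kg, i.e. about 1,050 lbs
```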

Map of flight path from Pointe-à-Pitre to Texada

See the map of the shortest flight path between Pointe-à-Pitre International Airport (PTP) and Texada/Gillies Bay Airport (YGB).

Airport information

Origin Pointe-à-Pitre International Airport
City: Pointe-à-Pitre
Country: Guadeloupe
IATA Code: PTP
ICAO Code: TFFR
Coordinates: 16°15′55″N, 61°31′54″W
Destination Texada/Gillies Bay Airport
City: Texada
Country: Canada
IATA Code: YGB
ICAO Code: CYGB
Coordinates: 49°41′39″N, 124°31′4″W
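The coordinates above are given in degrees, minutes, and seconds. A small helper to convert them to the decimal degrees used by the distance formulas (assuming the exact °/′/″ notation shown in the table) might look like:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like '16°15′55″N' to decimal degrees (S/W negative)."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    if not m:
        raise ValueError(f"unrecognized coordinate: {dms!r}")
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

lat = dms_to_decimal("16°15′55″N")   # ≈ 16.2653
lon = dms_to_decimal("61°31′54″W")   # ≈ -61.5317
```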