
How far is Texada from Djibouti?

The distance between Djibouti (Djibouti–Ambouli International Airport) and Texada (Texada/Gillies Bay Airport) is 8149 miles / 13115 kilometers / 7081 nautical miles.

Djibouti–Ambouli International Airport – Texada/Gillies Bay Airport

  • 8149 miles
  • 13115 kilometers
  • 7081 nautical miles
  • Flight time: 15 h 55 min
  • CO2 emission: 1 020 kg


Distance from Djibouti to Texada

There are several ways to calculate the distance from Djibouti to Texada. Here are two standard methods:

Vincenty's formula (applied above)
  • 8148.974 miles
  • 13114.503 kilometers
  • 7081.265 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
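To reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, fed with the airport coordinates listed under "Airport information" below. The calculator's exact ellipsoid constants and rounding are not published, so a small difference in the final decimals is expected.

    from math import atan, atan2, cos, radians, sin, sqrt, tan

    def vincenty_km(lat1, lon1, lat2, lon2):
        """Inverse Vincenty distance in kilometres on the WGS-84 ellipsoid."""
        a = 6378137.0                      # semi-major axis (m)
        f = 1 / 298.257223563              # flattening
        b = (1 - f) * a                    # semi-minor axis (m)

        L = radians(lon2 - lon1)
        U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
        U2 = atan((1 - f) * tan(radians(lat2)))
        sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

        lam = L
        for _ in range(200):               # iterate lambda until it converges
            sin_lam, cos_lam = sin(lam), cos(lam)
            sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
            if sin_sigma == 0:
                return 0.0                 # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1000.0      # metres -> kilometres

    # JIB (11°32′50″N, 43°9′34″E) to YGB (49°41′39″N, 124°31′4″W)
    km = vincenty_km(11.547222, 43.159444, 49.694167, -124.517778)
    print(round(km, 3), "km /", round(km * 0.621371, 3), "miles")  # expected ≈ 13114.5 km / 8149 miles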

Haversine formula
  • 8139.792 miles
  • 13099.725 kilometers
  • 7073.286 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
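For comparison, the haversine version takes only a few lines of Python. The calculator's sphere radius isn't stated; 6 371 km (the conventional mean Earth radius) is assumed here, so the last digits may differ slightly from the values above.

    from math import asin, cos, radians, sin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given mean radius."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        h = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * radius_km * asin(sqrt(h))

    # Same JIB -> YGB coordinates as above
    print(haversine_km(11.547222, 43.159444, 49.694167, -124.517778))  # ≈ 13100 km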

How long does it take to fly from Djibouti to Texada?

The estimated flight time from Djibouti–Ambouli International Airport to Texada/Gillies Bay Airport is 15 hours and 55 minutes.
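The page does not spell out how the flight time is derived. A common rule of thumb, assumed here, is an average block speed of roughly 850 km/h plus about 30 minutes for take-off and landing, which lands within a minute of the figure above.

    def estimated_flight_time(distance_km, cruise_kmh=850.0, overhead_h=0.5):
        # Assumed constants: ~850 km/h average speed, ~30 min for taxi/climb/descent.
        hours = overhead_h + distance_km / cruise_kmh
        h, m = int(hours), round((hours - int(hours)) * 60)
        if m == 60:
            h, m = h + 1, 0
        return f"{h} h {m} min"

    print(estimated_flight_time(13114.503))   # ≈ 15 h 56 min with these assumed constants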

Flight carbon footprint between Djibouti–Ambouli International Airport (JIB) and Texada/Gillies Bay Airport (YGB)

On average, flying from Djibouti to Texada generates about 1 020 kg (2 249 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
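As a quick sanity check on the numbers quoted above (all figures are taken from this page; the calculator's underlying emissions model is not published):

    co2_kg = 1020                    # per passenger, Djibouti -> Texada
    distance_km = 13115

    print(round(co2_kg * 2.20462))              # 2249 lbs, matching the pound figure
    print(round(co2_kg / distance_km * 1000))   # ≈ 78 g of CO2 per passenger-kilometre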

Map of flight path from Djibouti to Texada

See the map of the shortest flight path between Djibouti–Ambouli International Airport (JIB) and Texada/Gillies Bay Airport (YGB).

Airport information

Origin: Djibouti–Ambouli International Airport
City: Djibouti
Country: Djibouti
IATA Code: JIB
ICAO Code: HDAM
Coordinates: 11°32′50″N, 43°9′34″E

Destination: Texada/Gillies Bay Airport
City: Texada
Country: Canada
IATA Code: YGB
ICAO Code: CYGB
Coordinates: 49°41′39″N, 124°31′4″W