How far is Inukjuak from Texada?

The distance between Texada (Texada/Gillies Bay Airport) and Inukjuak (Inukjuak Airport) is 1939 miles / 3121 kilometers / 1685 nautical miles.

The driving distance from Texada (YGB) to Inukjuak (YPH) is 3280 miles / 5278 kilometers, and travel time by car is about 67 hours 49 minutes.


Distance from Texada to Inukjuak

There are several ways to calculate the distance from Texada to Inukjuak. Here are two standard methods:

Vincenty's formula (applied above)
  • 1939.094 miles
  • 3120.670 kilometers
  • 1685.027 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
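
For reference, below is a minimal Python sketch of Vincenty's inverse method. It assumes the WGS-84 ellipsoid (the calculator's exact ellipsoid parameters are not stated); fed the decimal-degree coordinates of YGB and YPH from the airport information below, it returns approximately the 3120.67 kilometers quoted above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: distance in meters on the WGS-84 ellipsoid (assumed)."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate the longitude difference to convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0         # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial-line special case
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# YGB (49°41′39″N, 124°31′4″W) → YPH (58°28′18″N, 78°4′36″W)
print(vincenty_distance(49.6942, -124.5178, 58.4717, -78.0767) / 1000)  # ≈ 3120 km
```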

Haversine formula
  • 1933.132 miles
  • 3111.074 kilometers
  • 1679.845 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
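
Under that spherical assumption the computation is short. The sketch below uses a mean Earth radius of 6371 km, which is an assumption (the calculator's exact radius is not stated), so the result lands within a kilometer or two of the 3111.07 km quoted above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius (assumed)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same YGB → YPH coordinates in decimal degrees
km = haversine_distance(49.6942, -124.5178, 58.4717, -78.0767)
print(f"{km:.1f} km")  # ≈ 3110–3112 km depending on rounding and radius
```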

How long does it take to fly from Texada to Inukjuak?

The estimated flight time from Texada/Gillies Bay Airport to Inukjuak Airport is 4 hours and 10 minutes.
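
The page does not say how this estimate is produced. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to cruise time at an assumed average ground speed; the sketch below uses 30 minutes and 500 mph, both assumptions, which is why it comes out slightly above the 4 hours 10 minutes quoted here.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # Rule-of-thumb estimate: fixed overhead for taxi/climb/descent plus
    # cruise at an assumed average ground speed. Both defaults are
    # assumptions, not the calculator's published parameters.
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(1939))  # "4 hours 23 minutes" with these assumptions
```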

Flight carbon footprint between Texada/Gillies Bay Airport (YGB) and Inukjuak Airport (YPH)

On average, flying from Texada to Inukjuak generates about 212 kg of CO2 per passenger; 212 kilograms is equal to 467 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
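
As a rough cross-check, the quoted numbers imply an average emission factor of about 0.109 kg of CO2 per passenger-mile (212 kg ÷ 1939 miles). The factor in the sketch below is reverse-engineered from this page rather than an official constant; real-world emissions vary with aircraft type, load factor, and route.

```python
KG_PER_LB = 0.45359237  # exact definition of the international pound

def co2_estimate_kg(distance_miles, kg_per_mile=0.1093):
    # kg_per_mile is an assumed emission factor inferred from the figures
    # above (212 kg / 1939 miles); it is not an official constant.
    return distance_miles * kg_per_mile

kg = co2_estimate_kg(1939)
print(f"{kg:.0f} kg CO2 = {kg / KG_PER_LB:.0f} lbs")  # 212 kg CO2 = 467 lbs
```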

Map of flight path and driving directions from Texada to Inukjuak

See the map of the shortest flight path between Texada/Gillies Bay Airport (YGB) and Inukjuak Airport (YPH).

Airport information

Origin: Texada/Gillies Bay Airport
City: Texada
Country: Canada
IATA Code: YGB
ICAO Code: CYGB
Coordinates: 49°41′39″N, 124°31′4″W
Destination: Inukjuak Airport
City: Inukjuak
Country: Canada
IATA Code: YPH
ICAO Code: CYPH
Coordinates: 58°28′18″N, 78°4′36″W
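
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect signed decimal degrees. A minimal converter (the function name is illustrative):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Parse a coordinate like 49°41′39″N into signed decimal degrees."""
    deg, minute, sec, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    # South and West hemispheres are negative by convention
    return -value if hemi in "SW" else value

print(dms_to_decimal("49°41′39″N"))  # 49.6941...
print(dms_to_decimal("124°31′4″W"))  # -124.5177...
```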