How far is Texada from Umiujaq?
The distance between Umiujaq (Umiujaq Airport) and Texada (Texada/Gillies Bay Airport) is 2008 miles / 3232 kilometers / 1745 nautical miles.
The driving distance from Umiujaq (YUD) to Texada (YGB) is 3311 miles / 5329 kilometers, and travel time by car is about 70 hours 1 minute.
Umiujaq Airport – Texada/Gillies Bay Airport
Distance from Umiujaq to Texada
There are several ways to calculate the distance from Umiujaq to Texada. Here are two standard methods:
Vincenty's formula (applied above)
- 2007.987 miles
- 3231.541 kilometers
- 1744.893 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
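As an illustration, the sketch below computes the ellipsoidal distance between the two airports in Python with the geographiclib package. Note that geographiclib implements Karney's algorithm rather than Vincenty's, but both solve the geodesic inverse problem on the WGS-84 ellipsoid and agree very closely here. The decimal coordinates are converted from the airport tables at the bottom of this page.

```python
from geographiclib.geodesic import Geodesic

# Airport coordinates in decimal degrees, converted from the tables below.
UMIUJAQ = (56.5358, -76.5181)    # YUD: 56°32′9″N, 76°31′5″W
TEXADA = (49.6942, -124.5178)    # YGB: 49°41′39″N, 124°31′4″W

# Solve the geodesic inverse problem on the WGS-84 ellipsoid.
result = Geodesic.WGS84.Inverse(UMIUJAQ[0], UMIUJAQ[1], TEXADA[0], TEXADA[1])
meters = result["s12"]  # geodesic distance in metres

print(f"{meters / 1609.344:.3f} miles")       # roughly 2008 miles
print(f"{meters / 1000:.3f} kilometers")      # roughly 3232 kilometers
print(f"{meters / 1852:.3f} nautical miles")  # roughly 1745 nautical miles
```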
Haversine formula
- 2001.727 miles
- 3221.468 kilometers
- 1739.454 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
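A minimal pure-Python sketch of the haversine calculation, using the same coordinates as above and a mean Earth radius of 6,371 km (the exact radius behind the figures above isn't stated, so the last digits may differ slightly):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere of the given radius."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# YUD -> YGB, coordinates converted from the airport tables below
km = haversine_km(56.5358, -76.5181, 49.6942, -124.5178)
print(f"{km:.1f} km, {km / 1.609344:.1f} miles, {km / 1.852:.1f} NM")
# Expect roughly 3221 km / 2002 miles / 1739 NM, matching the figures above.
```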
How long does it take to fly from Umiujaq to Texada?
The estimated flight time from Umiujaq Airport to Texada/Gillies Bay Airport is 4 hours and 18 minutes.
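The exact assumptions behind this estimate aren't published here. A common back-of-the-envelope approach is to divide the straight-line distance by a typical cruise speed and add a fixed allowance for take-off and landing; the sketch below uses hypothetical values (a 500 mph cruise speed and a 30-minute allowance), so it will not reproduce the 4 hours 18 minutes figure exactly.

```python
# Back-of-the-envelope flight-time estimate. Illustrative values only:
# neither the 500 mph cruise speed nor the 30-minute allowance is taken
# from this page, so the result differs slightly from 4 h 18 min.
distance_miles = 2008
cruise_speed_mph = 500          # assumed average cruise speed
taxi_takeoff_landing_min = 30   # assumed fixed allowance

total_min = distance_miles / cruise_speed_mph * 60 + taxi_takeoff_landing_min
hours, minutes = divmod(round(total_min), 60)
print(f"Estimated flight time: {hours} h {minutes} min")  # about 4 h 31 min
```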
What is the time difference between Umiujaq and Texada?
The time difference between Umiujaq and Texada is 3 hours. Texada is 3 hours behind Umiujaq.
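As a sketch, Python's standard-library zoneinfo module can confirm the offset, assuming Umiujaq (Nunavik, Quebec) follows Eastern Time and Texada Island (British Columbia) follows Pacific Time; both zones observe daylight saving time, so the 3-hour gap holds year-round.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Assumed zones: Umiujaq on Eastern Time, Texada Island on Pacific Time.
umiujaq_tz = ZoneInfo("America/Toronto")
texada_tz = ZoneInfo("America/Vancouver")

now = datetime.now(tz=umiujaq_tz)
diff_hours = (now.utcoffset() - now.astimezone(texada_tz).utcoffset()).total_seconds() / 3600
print(f"Texada is {diff_hours:.0f} hours behind Umiujaq")  # 3 hours
```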
Flight carbon footprint between Umiujaq Airport (YUD) and Texada/Gillies Bay Airport (YGB)
On average, flying from Umiujaq to Texada generates about 219 kg of CO2 per passenger, which is equivalent to roughly 482 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
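The page does not state how this estimate is derived. The sketch below simply back-calculates an implied per-passenger emission factor from the figures above (about 219 kg over 3,232 km, i.e. roughly 68 g of CO2 per passenger-kilometre) and shows the kilogram-to-pound conversion; it is an illustration of the arithmetic, not the site's actual methodology.

```python
# Implied per-passenger emission factor, derived from the figures on this page.
co2_kg = 219.0
distance_km = 3232.0

factor_g_per_km = co2_kg * 1000 / distance_km
print(f"Implied factor: {factor_g_per_km:.0f} g CO2 per passenger-km")  # ~68 g

# Kilograms to pounds (1 kg = 2.20462 lb)
print(f"{co2_kg} kg is about {co2_kg * 2.20462:.1f} lb")  # 482.8 lb
```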
Map of flight path and driving directions from Umiujaq to Texada
See the map of the shortest flight path between Umiujaq Airport (YUD) and Texada/Gillies Bay Airport (YGB).
Airport information
| Origin | Umiujaq Airport |
| --- | --- |
| City: | Umiujaq |
| Country: | Canada |
| IATA Code: | YUD |
| ICAO Code: | CYMU |
| Coordinates: | 56°32′9″N, 76°31′5″W |
| Destination | Texada/Gillies Bay Airport |
| --- | --- |
| City: | Texada |
| Country: | Canada |
| IATA Code: | YGB |
| ICAO Code: | CYGB |
| Coordinates: | 49°41′39″N, 124°31′4″W |