How far is Texada from Shanghai?
The distance between Shanghai (Shanghai Hongqiao International Airport) and Texada (Texada/Gillies Bay Airport) is 5560 miles / 8948 kilometers / 4831 nautical miles.
Shanghai Hongqiao International Airport – Texada/Gillies Bay Airport
Distance from Shanghai to Texada
There are several ways to calculate the distance from Shanghai to Texada. Here are two standard methods:
Vincenty's formula (applied above)
- 5559.845 miles
- 8947.703 kilometers
- 4831.373 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
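For reference, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. This is the standard iterative method, not necessarily the exact implementation behind the figure above; the coordinates are the SHA and YGB values from the airport table at the bottom of the page, converted to decimal degrees.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

# WGS-84 ellipsoid parameters
A = 6378137.0            # semi-major axis, meters
F = 1 / 298.257223563    # flattening
B = (1 - F) * A          # semi-minor axis

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse problem: geodesic distance on WGS-84, in km.

    Note: the iteration can fail to converge for nearly antipodal points.
    """
    U1 = atan((1 - F) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - F) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # Guard against division by zero for points on the equator
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    A_ = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B_ = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B_ * sin_sigma * (
        cos_2sigma_m + B_ / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B_ / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
    return B * A_ * (sigma - delta_sigma) / 1000.0

# SHA (31°11′52″N, 121°20′9″E) and YGB (49°41′39″N, 124°31′4″W)
print(vincenty_km(31.1978, 121.3358, 49.6942, -124.5178))  # ≈ 8948 km
```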
Haversine formula
- 5547.426 miles
- 8927.717 kilometers
- 4820.582 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
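As a concrete illustration, here is a minimal Python implementation of the haversine formula, again using the SHA and YGB coordinates from the airport table below in decimal degrees:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius: the sphere the haversine formula assumes

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# ≈ 8928 km; the small gap to the 8927.717 km figure above comes from
# the exact Earth-radius value chosen for the spherical model.
print(haversine_km(31.1978, 121.3358, 49.6942, -124.5178))
```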
How long does it take to fly from Shanghai to Texada?
The estimated flight time from Shanghai Hongqiao International Airport to Texada/Gillies Bay Airport is 11 hours and 1 minute.
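The page does not state the assumptions behind this estimate. As a rough sketch, an assumed average block speed of about 812 km/h over the 8,948 km route reproduces the quoted figure; the real cruise speed and taxi/climb/descent allowances may differ.

```python
distance_km = 8948        # Vincenty distance from above
assumed_speed_kmh = 812   # assumed average block speed; not stated by the source
hours = distance_km / assumed_speed_kmh
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ≈ 11 h 1 min
```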
What is the time difference between Shanghai and Texada?
The time difference between Shanghai and Texada is 16 hours: Texada is 16 hours behind Shanghai.
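This can be checked with Python's zoneinfo: Shanghai is UTC+8 year-round, while Texada Island follows British Columbia's Pacific time (UTC−8 in winter). Note the gap narrows to 15 hours during Pacific Daylight Time, since China does not observe DST.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A winter date, so Texada is on Pacific Standard Time (UTC-8)
when = datetime(2024, 1, 15, 12, 0)
shanghai = when.replace(tzinfo=ZoneInfo("Asia/Shanghai"))
texada = shanghai.astimezone(ZoneInfo("America/Vancouver"))
print(shanghai.utcoffset() - texada.utcoffset())  # 16:00:00
```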
Flight carbon footprint between Shanghai Hongqiao International Airport (SHA) and Texada/Gillies Bay Airport (YGB)
On average, flying from Shanghai to Texada generates about 658 kg of CO2 per passenger, which is roughly 1,450 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
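The source does not publish its emissions methodology. The sketch below shows only the kilogram-to-pound conversion and the per-passenger-kilometer factor implied by the quoted number; the factor is an inference from the page's own figures, not the site's actual model.

```python
co2_kg = 658
KG_TO_LB = 2.20462
print(round(co2_kg * KG_TO_LB))  # 1451 lb, which the page rounds to 1,450

distance_km = 8948
# Implied emission factor: ≈ 74 g CO2 per passenger-kilometer
print(round(co2_kg / distance_km * 1000))
```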
Map of flight path from Shanghai to Texada
See the map of the shortest flight path between Shanghai Hongqiao International Airport (SHA) and Texada/Gillies Bay Airport (YGB).
Airport information
| Origin | Shanghai Hongqiao International Airport |
| --- | --- |
| City | Shanghai |
| Country | China |
| IATA Code | SHA |
| ICAO Code | ZSSS |
| Coordinates | 31°11′52″N, 121°20′9″E |
| Destination | Texada/Gillies Bay Airport |
| --- | --- |
| City | Texada |
| Country | Canada |
| IATA Code | YGB |
| ICAO Code | CYGB |
| Coordinates | 49°41′39″N, 124°31′4″W |
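The coordinates in these tables are given in degrees, minutes, and seconds. A small helper function (hypothetical, not part of this page) converts them to the decimal degrees expected by the distance formulas above:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert e.g. '31°11′52″N' to decimal degrees (negative for S/W)."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("31°11′52″N"))  # ≈ 31.1978 (SHA latitude)
print(dms_to_decimal("124°31′4″W"))  # ≈ -124.5178 (YGB longitude)
```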