How far is Baise from Whitehorse?
The distance between Whitehorse (Erik Nielsen Whitehorse International Airport) and Baise (Baise Bama Airport) is 5669 miles / 9124 kilometers / 4926 nautical miles.
Erik Nielsen Whitehorse International Airport – Baise Bama Airport
Distance from Whitehorse to Baise
There are several ways to calculate the distance from Whitehorse to Baise. Here are two standard methods:
Vincenty's formula (applied above)
- 5669.205 miles
- 9123.701 kilometers
- 4926.405 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
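As a sketch of how such a figure can be computed, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the two airports' coordinates from the tables below. The function name and iteration cap are illustrative choices, not part of any particular library.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two points (Vincenty inverse)."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YXY (60°42′34″N, 135°4′1″W) to AEB (23°43′14″N, 106°57′35″E)
d_m = vincenty_distance(60.709444, -135.066944, 23.720556, 106.959722)
d_km = d_m / 1000
```

Run against the decimal-degree equivalents of the coordinates in the airport tables, this reproduces the ~9124 km figure quoted above.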
Haversine formula
- 5659.339 miles
- 9107.823 kilometers
- 4917.831 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
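The haversine calculation is short enough to show in full. This is a minimal Python sketch using a mean Earth radius of 6371 km (the spherical-model assumption described above); the function name is illustrative.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles, assuming a spherical Earth."""
    R_km = 6371.0  # mean Earth radius (km)
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    d_km = 2 * R_km * math.asin(math.sqrt(a))
    return d_km / 1.609344  # km per statute mile

# YXY (60°42′34″N, 135°4′1″W) to AEB (23°43′14″N, 106°57′35″E)
miles = haversine_miles(60.709444, -135.066944, 23.720556, 106.959722)
```

This gives roughly 5659 miles, matching the haversine figure above; the ~10-mile gap to the Vincenty result reflects the spherical versus ellipsoidal Earth models.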
How long does it take to fly from Whitehorse to Baise?
The estimated flight time from Erik Nielsen Whitehorse International Airport to Baise Bama Airport is 11 hours and 14 minutes.
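The page does not state how the flight time is derived, but a figure of this kind typically comes from dividing the great-circle distance by an assumed average speed. As a rough sketch, an assumed average block speed of about 505 mph (our assumption, not from the source) reproduces the quoted estimate:

```python
distance_miles = 5669.205   # Vincenty distance from above
avg_speed_mph = 505         # assumed average speed; not stated by the source

total_hours = distance_miles / avg_speed_mph
hours = int(total_hours)
minutes = round((total_hours - hours) * 60)
# -> 11 hours and 14 minutes
```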
What is the time difference between Whitehorse and Baise?
The time difference between Whitehorse and Baise is 15 hours: Baise is 15 hours ahead of Whitehorse.
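This can be checked with Python's standard `zoneinfo` module: Whitehorse uses the America/Whitehorse zone (on Mountain Standard Time, UTC−7, year-round since 2020), and Baise, like all of China, uses China Standard Time (Asia/Shanghai, UTC+8).

```python
from datetime import datetime
from zoneinfo import ZoneInfo

t = datetime(2024, 6, 1, 12, 0)  # any date works; neither zone observes DST now
offset_wh = t.replace(tzinfo=ZoneInfo("America/Whitehorse")).utcoffset()
offset_bs = t.replace(tzinfo=ZoneInfo("Asia/Shanghai")).utcoffset()
diff_hours = (offset_bs - offset_wh).total_seconds() / 3600  # 15.0
```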
Flight carbon footprint between Erik Nielsen Whitehorse International Airport (YXY) and Baise Bama Airport (AEB)
On average, flying from Whitehorse to Baise generates about 672 kg of CO2 per passenger, and 672 kilograms equals 1,482 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
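The kilograms-to-pounds conversion above follows from the exact definition of the pound (1 lb = 0.45359237 kg):

```python
co2_kg = 672                # per-passenger estimate from the source
kg_per_lb = 0.45359237     # exact definition of the avoirdupois pound
co2_lbs = co2_kg / kg_per_lb  # ~1481.5, i.e. about 1,482 lbs
```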
Map of flight path from Whitehorse to Baise
See the map of the shortest flight path between Erik Nielsen Whitehorse International Airport (YXY) and Baise Bama Airport (AEB).
Airport information
| Origin | Erik Nielsen Whitehorse International Airport |
| --- | --- |
| City: | Whitehorse |
| Country: | Canada |
| IATA Code: | YXY |
| ICAO Code: | CYXY |
| Coordinates: | 60°42′34″N, 135°4′1″W |
| Destination | Baise Bama Airport |
| --- | --- |
| City: | Baise |
| Country: | China |
| IATA Code: | AEB |
| ICAO Code: | ZGBS |
| Coordinates: | 23°43′14″N, 106°57′35″E |