How far is Windsor from Beijing?
The distance between Beijing (Beijing Daxing International Airport) and Windsor (Windsor International Airport) is 6673 miles / 10739 kilometers / 5799 nautical miles.
Beijing Daxing International Airport – Windsor International Airport
Distance from Beijing to Windsor
There are several ways to calculate the distance from Beijing to Windsor. Here are two standard methods:
Vincenty's formula (applied above)
- 6672.859 miles
- 10738.926 kilometers
- 5798.556 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
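For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The iteration limit and convergence threshold are arbitrary choices, and nearly antipodal point pairs may fail to converge; this is an illustration, not the calculator's own implementation.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Ellipsoidal (WGS-84) distance in kilometres via Vincenty's inverse formula."""
    a = 6378137.0           # semi-major axis (m)
    f = 1 / 298.257223563   # flattening
    b = (1 - f) * a         # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate until longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2) -
        B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# PKX -> YQG, coordinates (decimal degrees) from the airport table below
print(f"{vincenty_km(39.5092, 116.4106, 42.2756, -82.9556):.1f} km")
```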
Haversine formula
- 6657.112 miles
- 10713.583 kilometers
- 5784.872 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
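As a rough check on the figures above, here is a minimal haversine sketch in Python. It assumes a mean Earth radius of 6,371 km and uses the PKX/YQG coordinates from the airport table below; the result should land close to the ~10,714 km value listed for the haversine method.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# PKX (39.5092 N, 116.4106 E) -> YQG (42.2756 N, 82.9556 W), decimal degrees
km = haversine_km(39.5092, 116.4106, 42.2756, -82.9556)
print(f"{km:.0f} km / {km * 0.621371:.0f} mi / {km / 1.852:.0f} nm")
```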
How long does it take to fly from Beijing to Windsor?
The estimated flight time from Beijing Daxing International Airport to Windsor International Airport is 13 hours and 8 minutes.
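Estimates like this are typically just distance divided by an assumed average speed. The sketch below uses roughly 508 mph, a value back-calculated from the figures above rather than a speed published by the calculator, so treat it only as an illustration of the arithmetic.

```python
def estimated_flight_time(distance_miles, avg_speed_mph=508):
    """Rough block-time estimate: distance over an assumed average speed."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours and {m} minutes"

print(estimated_flight_time(6673))  # roughly "13 hours and 8 minutes"
```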
What is the time difference between Beijing and Windsor?
The time difference between Beijing and Windsor is 13 hours. Windsor is 13 hours behind Beijing.
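The offset can be verified with Python's zoneinfo data; note that Windsor (Eastern Time) observes daylight saving time while Beijing does not, so the gap narrows to 12 hours in summer. The example date below is an arbitrary winter day chosen for illustration.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A winter date, when Windsor is on Eastern Standard Time.
moment = datetime(2024, 1, 15, 12, 0, tzinfo=ZoneInfo("Asia/Shanghai"))
diff = moment.utcoffset() - moment.astimezone(ZoneInfo("America/Toronto")).utcoffset()
print(diff.total_seconds() / 3600)  # 13.0 (12.0 while Windsor observes DST)
```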
Flight carbon footprint between Beijing Daxing International Airport (PKX) and Windsor International Airport (YQG)
On average, flying from Beijing to Windsor generates about 809 kg of CO2 per passenger; 809 kilograms is equal to 1,784 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
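The unit conversion and the implied per-kilometre rate can be checked directly. The per-passenger-per-kilometre figure below is simply back-calculated from the estimate above; it is not a published emission factor.

```python
KG_PER_LB = 0.45359237

co2_kg = 809          # estimated per-passenger CO2 for this route
distance_km = 10739   # route distance from above

print(round(co2_kg / KG_PER_LB))       # ~1784 lbs
print(round(co2_kg / distance_km, 3))  # ~0.075 kg CO2 per passenger-km (implied)
```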
Map of flight path from Beijing to Windsor
See the map of the shortest flight path between Beijing Daxing International Airport (PKX) and Windsor International Airport (YQG).
Airport information
| Origin | Beijing Daxing International Airport |
| --- | --- |
| City: | Beijing |
| Country: | China |
| IATA Code: | PKX |
| ICAO Code: | ZBAD |
| Coordinates: | 39°30′33″N, 116°24′38″E |
| Destination | Windsor International Airport |
| --- | --- |
| City: | Windsor |
| Country: | Canada |
| IATA Code: | YQG |
| ICAO Code: | CYQG |
| Coordinates: | 42°16′32″N, 82°57′20″W |
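The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion helper, included only as a sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# PKX: 39°30′33″N, 116°24′38″E    YQG: 42°16′32″N, 82°57′20″W
pkx = (dms_to_decimal(39, 30, 33, "N"), dms_to_decimal(116, 24, 38, "E"))
yqg = (dms_to_decimal(42, 16, 32, "N"), dms_to_decimal(82, 57, 20, "W"))
print(pkx, yqg)
```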