How far is Nanaimo from Beijing?
The distance between Beijing (Beijing Daxing International Airport) and Nanaimo (Nanaimo Airport) is 5314 miles / 8552 kilometers / 4618 nautical miles.
Beijing Daxing International Airport – Nanaimo Airport
Distance from Beijing to Nanaimo
There are several ways to calculate the distance from Beijing to Nanaimo. Here are two standard methods:
Vincenty's formula (applied above)
- 5314.183 miles
- 8552.348 kilometers
- 4617.898 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
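As a sketch, Vincenty's inverse method can be implemented as follows. The WGS-84 ellipsoid parameters are an assumption (the article does not state which ellipsoid it uses), and the coordinates are the PKX and YCD positions from the airport information tables, converted to decimal degrees.

```python
import math

A_AXIS = 6378137.0              # WGS-84 semi-major axis (m) -- assumed ellipsoid
F = 1 / 298.257223563           # WGS-84 flattening
B_AXIS = (1 - F) * A_AXIS       # semi-minor axis (m)

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Iteratively solve Vincenty's inverse problem; return distance in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:   # converged
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m
            * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma) / 1000

# PKX -> YCD
print(round(vincenty_km(39.5092, 116.4106, 49.0522, -123.8700), 1))
```

The result should land within a kilometer or two of the 8552.348 km figure above; small differences come from the arc-second precision of the published coordinates.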
Haversine formula
- 5300.241 miles
- 8529.910 kilometers
- 4605.783 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
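The haversine calculation is much shorter. A minimal sketch, assuming a mean Earth radius of 6371 km (the article does not state which radius it uses) and the same decimal-degree coordinates as above:

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius -- an assumed value

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a spherical Earth, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# PKX -> YCD
print(round(haversine_km(39.5092, 116.4106, 49.0522, -123.8700), 1))
```

This comes out within a few kilometers of the 8529.910 km figure above; the exact value depends on the Earth radius chosen.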
How long does it take to fly from Beijing to Nanaimo?
The estimated flight time from Beijing Daxing International Airport to Nanaimo Airport is 10 hours and 33 minutes.
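The article does not state how its flight time is derived. One common rule of thumb, shown here purely as an assumption, is to divide the distance by a typical jet cruise speed of about 500 mph and add roughly 30 minutes for climb, descent, and taxi; it lands in the same neighborhood as the figure above, though not exactly on it.

```python
# Rough block-time estimate (rule of thumb, not the article's method):
# cruise at ~500 mph plus ~30 minutes of non-cruise time.
distance_mi = 5314
hours = distance_mi / 500 + 0.5
h, m = int(hours), round((hours % 1) * 60)
print(f"{h} h {m} min")  # 11 h 8 min
```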
What is the time difference between Beijing and Nanaimo?
The time difference between Beijing and Nanaimo is 16 hours: Nanaimo is 16 hours behind Beijing.
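This can be checked with Python's `zoneinfo` module. Nanaimo's `America/Vancouver` zone is an assumption (the article names only the city); note the gap is 16 hours during Pacific Standard Time but narrows to 15 hours when daylight saving is in effect, since China observes no DST.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A winter date, when Pacific Standard Time (UTC-8) is in effect.
beijing = datetime(2024, 1, 15, 12, 0, tzinfo=ZoneInfo("Asia/Shanghai"))
nanaimo = beijing.astimezone(ZoneInfo("America/Vancouver"))

# Difference between the two UTC offsets, in hours.
diff_hours = (beijing.utcoffset() - nanaimo.utcoffset()).total_seconds() / 3600
print(diff_hours)  # 16.0
```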
Flight carbon footprint between Beijing Daxing International Airport (PKX) and Nanaimo Airport (YCD)
On average, flying from Beijing to Nanaimo generates about 625 kg of CO2 per passenger, which is roughly 1,378 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
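The kilogram-to-pound conversion is easy to verify, using the exact definition of the international pound:

```python
KG_PER_LB = 0.45359237      # exact definition of the pound in kilograms
co2_kg = 625                # per-passenger estimate from the article
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # 1378
```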
Airport information
| Origin | Beijing Daxing International Airport |
| --- | --- |
| City | Beijing |
| Country | China |
| IATA Code | PKX |
| ICAO Code | ZBAD |
| Coordinates | 39°30′33″N, 116°24′38″E |
| Destination | Nanaimo Airport |
| --- | --- |
| City | Nanaimo |
| Country | Canada |
| IATA Code | YCD |
| ICAO Code | CYCD |
| Coordinates | 49°3′8″N, 123°52′12″W |
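The coordinates above are given in degrees, minutes, and seconds; the distance formulas earlier need them as signed decimal degrees. A small helper for the conversion (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees.

    South and West hemispheres get a negative sign.
    """
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(round(dms_to_decimal(39, 30, 33, "N"), 4))   # 39.5092  (PKX latitude)
print(round(dms_to_decimal(123, 52, 12, "W"), 4))  # -123.87  (YCD longitude)
```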