
How far is Cranbrook from Beijing?

The distance between Beijing (Beijing Nanyuan Airport) and Cranbrook (Cranbrook/Canadian Rockies International Airport) is 5509 miles / 8865 kilometers / 4787 nautical miles.


Distance from Beijing to Cranbrook

There are several ways to calculate the distance from Beijing to Cranbrook. Here are two standard methods:

Vincenty's formula (applied above)
  • 5508.530 miles
  • 8865.120 kilometers
  • 4786.782 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
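As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The site's exact implementation and ellipsoid parameters are not published, so treat this as an approximation that should land close to the roughly 8865 km figure above:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: distance in meters on the WGS-84 ellipsoid."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero for equatorial lines where cos2_alpha == 0
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (2 * cos_2sm ** 2 - 1)
        - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return b * A * (sigma - d_sigma)  # meters

# NAY -> YXC with the coordinates from the airport table below (decimal degrees)
print(vincenty_distance(39.7828, 116.3878, 49.6106, -115.7819) / 1000)  # ~8865 km
```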

Haversine formula
  • 5494.016 miles
  • 8841.761 kilometers
  • 4774.169 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
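A minimal sketch of the haversine formula, assuming a mean Earth radius of 6371 km; the radius chosen shifts the result slightly, which is one reason published haversine figures can vary by a few kilometers:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# NAY -> YXC with the coordinates from the airport table below
print(haversine_km(39.7828, 116.3878, 49.6106, -115.7819))  # close to the ~8842 km above
```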

How long does it take to fly from Beijing to Cranbrook?

The estimated flight time from Beijing Nanyuan Airport to Cranbrook/Canadian Rockies International Airport is 10 hours and 55 minutes.
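The site does not publish its flight-time model. A common back-of-the-envelope approach, shown below purely as an illustration, divides the distance by an assumed average block speed of around 500 mph and adds about 30 minutes for taxi, climb, and descent. The speed and buffer here are assumptions, not the calculator's stated parameters, so this lands in the same ballpark as, but not exactly on, the 10 hours and 55 minutes above:

```python
def estimate_flight_time(distance_miles, avg_speed_mph=500.0, buffer_hours=0.5):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb buffer.
    The 500 mph speed and 30-minute buffer are illustrative assumptions,
    not the calculator's published parameters."""
    hours = distance_miles / avg_speed_mph + buffer_hours
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours and {m} minutes"

print(estimate_flight_time(5509))  # ballpark of the estimate quoted above
```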

Flight carbon footprint between Beijing Nanyuan Airport (NAY) and Cranbrook/Canadian Rockies International Airport (YXC)

On average, flying from Beijing to Cranbrook generates about 651 kg of CO2 per passenger, equivalent to roughly 1,435 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
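The kilograms-to-pounds conversion is straightforward arithmetic; the per-passenger CO2 estimate itself depends on the site's unpublished fuel-burn model, so only the unit conversion is checked here:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 651
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # 1435 lbs, matching the figure quoted above
```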

Map of flight path from Beijing to Cranbrook

See the map of the shortest flight path between Beijing Nanyuan Airport (NAY) and Cranbrook/Canadian Rockies International Airport (YXC).

Airport information

Origin: Beijing Nanyuan Airport
City: Beijing
Country: China
IATA Code: NAY
ICAO Code: ZBNY
Coordinates: 39°46′58″N, 116°23′16″E
Destination: Cranbrook/Canadian Rockies International Airport
City: Cranbrook
Country: Canada
IATA Code: YXC
ICAO Code: CYXC
Coordinates: 49°36′38″N, 115°46′55″W
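The coordinates above are given in degrees, minutes, and seconds, while the distance functions sketched earlier expect decimal degrees. A small hypothetical helper for the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees.
    Southern and western hemispheres are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Airport coordinates from the table above
nay = (dms_to_decimal(39, 46, 58, "N"), dms_to_decimal(116, 23, 16, "E"))
yxc = (dms_to_decimal(49, 36, 38, "N"), dms_to_decimal(115, 46, 55, "W"))
print(nay)  # approximately (39.7828, 116.3878)
print(yxc)  # approximately (49.6106, -115.7819)
```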