
How far is Beijing from Chandigarh?

The distance between Chandigarh (Chandigarh Airport) and Beijing (Beijing Nanyuan Airport) is 2306 miles / 3711 kilometers / 2004 nautical miles.

The driving distance from Chandigarh (IXC) to Beijing (NAY) is 3587 miles / 5773 kilometers, and travel time by car is about 67 hours 17 minutes.

Chandigarh Airport – Beijing Nanyuan Airport

  • Distance: 2306 miles / 3711 kilometers / 2004 nautical miles
  • Flight time: 4 h 51 min
  • Time difference: 2 h 30 min
  • CO2 emission: 253 kg


Distance from Chandigarh to Beijing

There are several ways to calculate the distance from Chandigarh to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 2305.875 miles
  • 3710.946 kilometers
  • 2003.750 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
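The ellipsoidal figures above are consistent with the standard Vincenty inverse method on the WGS-84 ellipsoid. A sketch in Python follows; the iteration cap and convergence tolerance are implementation choices of this sketch, not documented parameters of the calculator:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0                  # WGS-84 semi-major axis, metres
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):           # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# IXC and NAY coordinates from the airport information below, in decimal degrees
dist_km = vincenty_inverse(30.6733, 76.7883, 39.7828, 116.3878) / 1000
```

For this airport pair the iteration converges quickly and `dist_km` should agree closely with the 3710.946 km figure above.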

Haversine formula
  • 2301.506 miles
  • 3703.915 kilometers
  • 1999.954 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
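The haversine calculation is compact enough to show in full. The sketch below assumes a mean Earth radius of 6371 km (3958.8 miles); the page's exact figures may use marginally different constants:

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius=6371.0):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius * math.asin(math.sqrt(a))

# IXC and NAY coordinates from the airport information below, in decimal degrees
km = haversine(30.6733, 76.7883, 39.7828, 116.3878)
miles = haversine(30.6733, 76.7883, 39.7828, 116.3878, radius=3958.8)
```

Both results land within a few kilometres of the haversine figures quoted above.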

How long does it take to fly from Chandigarh to Beijing?

The estimated flight time from Chandigarh Airport to Beijing Nanyuan Airport is 4 hours and 51 minutes.
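Flight-time estimates like this typically come from a rule of thumb: distance divided by an assumed average speed, plus a fixed allowance for taxi, climb, and descent. The 500 mph and 30-minute values below are illustrative assumptions, not the site's actual parameters, so the result only lands in the same ballpark as the 4 h 51 min above:

```python
def estimate_flight_time(distance_miles, avg_speed_mph=500, overhead_min=30):
    """Return (hours, minutes) for a crude flight-time estimate."""
    total_min = distance_miles / avg_speed_mph * 60 + overhead_min
    return divmod(round(total_min), 60)

h, m = estimate_flight_time(2306)  # → (5, 7), i.e. about 5 h 7 min
```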

Flight carbon footprint between Chandigarh Airport (IXC) and Beijing Nanyuan Airport (NAY)

On average, flying from Chandigarh to Beijing generates about 253 kg of CO2 per passenger, which is roughly 557 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
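The kilogram-to-pound conversion above is a fixed unit factor:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 253
co2_lb = co2_kg * KG_TO_LB  # ≈ 557.8 lb, quoted above as 557 lbs
```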

Map of flight path and driving directions from Chandigarh to Beijing

See the map of the shortest flight path between Chandigarh Airport (IXC) and Beijing Nanyuan Airport (NAY).

Airport information

Origin Chandigarh Airport
City: Chandigarh
Country: India
IATA Code: IXC
ICAO Code: VICG
Coordinates: 30°40′24″N, 76°47′18″E
Destination Beijing Nanyuan Airport
City: Beijing
Country: China
IATA Code: NAY
ICAO Code: ZBNY
Coordinates: 39°46′58″N, 116°23′16″E
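The coordinates above are in degrees-minutes-seconds; the distance formulas expect decimal degrees. A minimal conversion sketch (the variable names `ixc` and `nay` are just illustrative labels):

```python
def dms_to_decimal(degrees, minutes, seconds):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return degrees + minutes / 60 + seconds / 3600

ixc = (dms_to_decimal(30, 40, 24), dms_to_decimal(76, 47, 18))   # Chandigarh
nay = (dms_to_decimal(39, 46, 58), dms_to_decimal(116, 23, 16))  # Beijing Nanyuan
```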