
How far is Beijing from Chandigarh?

The distance between Chandigarh (Chandigarh Airport) and Beijing (Beijing Daxing International Airport) is 2306 miles / 3711 kilometers / 2004 nautical miles.

The driving distance from Chandigarh (IXC) to Beijing (PKX) is 3578 miles / 5758 kilometers, and travel time by car is about 66 hours 55 minutes.

Chandigarh Airport – Beijing Daxing International Airport

Distance: 2306 miles / 3711 kilometers / 2004 nautical miles
Flight time: 4 h 51 min
Time difference: 2 h 30 min
CO2 emission: 253 kg


Distance from Chandigarh to Beijing

There are several ways to calculate the distance from Chandigarh to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 2306.068 miles
  • 3711.257 kilometers
  • 2003.918 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
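As an illustration (not the calculator's own code), Vincenty's inverse method on the WGS-84 ellipsoid can be sketched in Python. The coordinates below are the decimal-degree equivalents of the airport coordinates listed further down:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula: distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM +
                                    C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# IXC (30°40′24″N, 76°47′18″E) to PKX (39°30′33″N, 116°24′38″E)
d_km = vincenty_distance(30.673333, 76.788333, 39.509167, 116.410556) / 1000.0
print(round(d_km, 3))  # ≈ 3711 km, in line with the figure above
```

The iteration converges quickly for non-antipodal point pairs such as these; the final series in A, B, and Δσ corrects the arc length for the ellipsoid's flattening.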

Haversine formula
  • 2301.673 miles
  • 3704.184 kilometers
  • 2000.099 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
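The haversine calculation is much shorter. A minimal sketch, assuming the conventional mean Earth radius of 6371 km and the same decimal-degree coordinates as above:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) between two points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# IXC (30°40′24″N, 76°47′18″E) to PKX (39°30′33″N, 116°24′38″E)
d_km = haversine_distance(30.673333, 76.788333, 39.509167, 116.410556)
print(round(d_km, 3))  # ≈ 3704 km, in line with the figure above
```

Because it treats the Earth as a perfect sphere, the haversine result differs from the ellipsoidal Vincenty result by a few kilometres on a route of this length.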

How long does it take to fly from Chandigarh to Beijing?

The estimated flight time from Chandigarh Airport to Beijing Daxing International Airport is 4 hours and 51 minutes.

Flight carbon footprint between Chandigarh Airport (IXC) and Beijing Daxing International Airport (PKX)

On average, flying from Chandigarh to Beijing generates about 253 kg of CO2 per passenger; 253 kilograms equals roughly 557 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Chandigarh to Beijing

See the map of the shortest flight path between Chandigarh Airport (IXC) and Beijing Daxing International Airport (PKX).

Airport information

Origin Chandigarh Airport
City: Chandigarh
Country: India
IATA Code: IXC
ICAO Code: VICG
Coordinates: 30°40′24″N, 76°47′18″E
Destination Beijing Daxing International Airport
City: Beijing
Country: China
IATA Code: PKX
ICAO Code: ZBAD
Coordinates: 39°30′33″N, 116°24′38″E
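The coordinates above are given in degrees/minutes/seconds, while distance formulas work in decimal degrees. A minimal conversion helper (the airport values are taken from the listings above):

```python
def dms_to_decimal(degrees, minutes, seconds):
    """Convert degrees/minutes/seconds to decimal degrees.

    For southern latitudes or western longitudes, negate the result.
    """
    return degrees + minutes / 60 + seconds / 3600

# Chandigarh Airport (IXC): 30°40′24″N, 76°47′18″E
ixc = (dms_to_decimal(30, 40, 24), dms_to_decimal(76, 47, 18))
# Beijing Daxing International Airport (PKX): 39°30′33″N, 116°24′38″E
pkx = (dms_to_decimal(39, 30, 33), dms_to_decimal(116, 24, 38))
print(ixc, pkx)
```

These decimal pairs (about 30.6733°N, 76.7883°E and 39.5092°N, 116.4106°E) are what the distance formulas in the earlier section take as input.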