
How far is Zhanjiang from Siliguri?

The distance between Siliguri (Bagdogra Airport) and Zhanjiang (Zhanjiang Airport) is 1441 miles / 2319 kilometers / 1252 nautical miles.

The driving distance from Siliguri (IXB) to Zhanjiang (ZHA) is 2199 miles / 3539 kilometers, and travel time by car is about 43 hours 25 minutes.

Bagdogra Airport – Zhanjiang Airport

Distance: 1441 miles / 2319 kilometers / 1252 nautical miles
Flight time: 3 h 13 min
CO2 emission: 176 kg


Distance from Siliguri to Zhanjiang

There are several ways to calculate the distance from Siliguri to Zhanjiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1441.088 miles
  • 2319.207 kilometers
  • 1252.272 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
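As a sketch of how such a figure can be computed, the standard iterative form of Vincenty's inverse method on the WGS-84 ellipsoid is shown below (the decimal-degree coordinates are the airport coordinates from this page; this is an illustrative implementation, not necessarily the exact code used by this site):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in km via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0           # semi-major axis (m)
    f = 1 / 298.257223563   # flattening
    b = (1 - f) * a         # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# IXB -> ZHA, coordinates in decimal degrees
print(f"{vincenty_km(26.681111, 88.328333, 21.214167, 110.357778):.1f} km")
```

The iteration converges quickly for points like these; the simplified form above does not handle the degenerate equatorial and antipodal cases of the full algorithm.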

Haversine formula
  • 1439.236 miles
  • 2316.225 kilometers
  • 1250.662 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface of the sphere.
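The haversine calculation is short enough to show in full. The sketch below uses the mean earth radius of 6371 km and the two airports' coordinates from this page converted to decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# IXB -> ZHA, coordinates in decimal degrees
print(f"{haversine_km(26.681111, 88.328333, 21.214167, 110.357778):.1f} km")
```

Because the sphere is only an approximation of the ellipsoid, the result differs from the Vincenty figure by about 3 km here, well under 0.2%.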

How long does it take to fly from Siliguri to Zhanjiang?

The estimated flight time from Bagdogra Airport to Zhanjiang Airport is 3 hours and 13 minutes.
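Flight-time estimates like this are typically derived from the great-circle distance using an assumed average cruise speed plus a fixed allowance for takeoff, climb, descent, and taxi. The page does not state the values it uses, so both constants below are assumptions; with a 500 mph cruise and a 30-minute allowance the sketch lands within about ten minutes of the 3 h 13 min figure above:

```python
CRUISE_MPH = 500    # assumed average cruise speed (not stated on the page)
OVERHEAD_MIN = 30   # assumed allowance for climb, descent, and taxi

distance_miles = 1441
total_min = round(distance_miles / CRUISE_MPH * 60) + OVERHEAD_MIN
print(f"{total_min // 60} h {total_min % 60} min")
```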

Flight carbon footprint between Bagdogra Airport (IXB) and Zhanjiang Airport (ZHA)

On average, flying from Siliguri to Zhanjiang generates about 176 kg of CO2 per passenger, equivalent to roughly 388 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
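The kilogram-to-pound conversion behind that figure is a single multiplication:

```python
KG_TO_LB = 2.20462  # pounds per kilogram
co2_kg = 176
print(f"{co2_kg * KG_TO_LB:.0f} lbs")  # matches the 388 lbs quoted above
```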

Map of flight path and driving directions from Siliguri to Zhanjiang

See the map of the shortest flight path between Bagdogra Airport (IXB) and Zhanjiang Airport (ZHA).

Airport information

Origin Bagdogra Airport
City: Siliguri
Country: India
IATA Code: IXB
ICAO Code: VEBD
Coordinates: 26°40′52″N, 88°19′42″E
Destination Zhanjiang Airport
City: Zhanjiang
Country: China
IATA Code: ZHA
ICAO Code: ZGZJ
Coordinates: 21°12′51″N, 110°21′28″E
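The coordinates above are in degrees/minutes/seconds; the distance formulas need them as signed decimal degrees. The conversion is a small helper (the function name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemi):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1 if hemi in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(f"{dms_to_decimal(26, 40, 52, 'N'):.6f}")   # IXB latitude
print(f"{dms_to_decimal(110, 21, 28, 'E'):.6f}")  # ZHA longitude
```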