How far is Ji'an from Sylhet?

The distance between Sylhet (Osmani International Airport) and Ji'an (Jinggangshan Airport) is 1428 miles / 2298 kilometers / 1241 nautical miles.

The driving distance from Sylhet (ZYL) to Ji'an (JGS) is 2156 miles / 3470 kilometers, and travel time by car is about 42 hours 56 minutes.

Osmani International Airport – Jinggangshan Airport

  • 1428 miles
  • 2298 kilometers
  • 1241 nautical miles

Distance from Sylhet to Ji'an

There are several ways to calculate the distance from Sylhet to Ji'an. Here are two standard methods:

Vincenty's formula (applied above)
  • 1427.867 miles
  • 2297.930 kilometers
  • 1240.783 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
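The page doesn't publish its implementation, but the calculation can be reproduced. Below is a minimal sketch of Vincenty's inverse method on the WGS-84 ellipsoid; the function name, tolerance, and iteration cap are our own choices, not values the calculator discloses.

```python
import math

def vincenty_inverse_m(lat1, lon1, lat2, lon2,
                       a=6378137.0, f=1 / 298.257223563,
                       tol=1e-12, max_iter=200):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns metres.

    tol and max_iter are conventional choices, not the calculator's.
    Nearly antipodal points may fail to converge, a known limitation
    of the method.
    """
    b = a * (1 - f)                     # semi-minor axis
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = 0.0 if cos2_alpha == 0 else (
            cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:   # converged
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# ZYL to JGS in decimal degrees; divide metres by 1609.344 for statute miles
print(vincenty_inverse_m(24.9631, 91.8667, 26.8567, 114.7369) / 1609.344)
# ≈ 1427.9 miles, matching the figure quoted above
```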

Haversine formula
  • 1425.425 miles
  • 2293.999 kilometers
  • 1238.660 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
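The haversine formula is short enough to write out in full. This sketch uses the conventional mean Earth radius of 6371 km; with the two airports' coordinates it reproduces the ≈2294 km figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# ZYL (24.9631 N, 91.8667 E) to JGS (26.8567 N, 114.7369 E)
print(haversine_km(24.9631, 91.8667, 26.8567, 114.7369))  # ≈ 2294 km
```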

How long does it take to fly from Sylhet to Ji'an?

The estimated flight time from Osmani International Airport to Jinggangshan Airport is 3 hours and 12 minutes.
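The calculator doesn't state how this estimate is derived. A common rule of thumb is a fixed overhead for taxi, climb, and descent plus cruise at roughly 500 mph; the sketch below uses those assumed values and lands within about ten minutes of the quoted figure.

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough flight-time estimate: fixed overhead plus time at cruise speed.

    cruise_mph and overhead_min are assumed rule-of-thumb values,
    not the calculator's published method.
    """
    total_min = overhead_min + distance_miles / cruise_mph * 60.0
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(1428))  # -> "3 hours 21 minutes"
```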

Flight carbon footprint between Osmani International Airport (ZYL) and Jinggangshan Airport (JGS)

On average, flying from Sylhet to Ji'an generates about 175 kg (386 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
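The page doesn't publish its emission model, but the numbers above imply roughly 0.076 kg of CO2 per passenger-kilometre (175 kg over 2298 km). A minimal sketch using that implied linear factor (an assumption, not the calculator's actual model):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def co2_estimate_kg(distance_km, kg_per_pax_km=0.0762):
    """CO2 per passenger from jet-fuel burn, assuming a linear per-km factor."""
    return distance_km * kg_per_pax_km

kg = co2_estimate_kg(2298)
print(f"{kg:.0f} kg = {kg / KG_PER_LB:.0f} lbs")  # -> 175 kg = 386 lbs
```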

Map of flight path and driving directions from Sylhet to Ji'an

See the map of the shortest flight path between Osmani International Airport (ZYL) and Jinggangshan Airport (JGS).

Airport information

Origin: Osmani International Airport
City: Sylhet
Country: Bangladesh
IATA Code: ZYL
ICAO Code: VGSY
Coordinates: 24°57′47″N, 91°52′0″E
Destination: Jinggangshan Airport
City: Ji'an
Country: China
IATA Code: JGS
ICAO Code: ZSJA
Coordinates: 26°51′24″N, 114°44′13″E
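The coordinates above are given in degrees, minutes, and seconds, while both distance formulas expect decimal degrees. A small conversion helper (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60.0 + seconds / 3600.0)

# ZYL: 24°57′47″N, 91°52′0″E    JGS: 26°51′24″N, 114°44′13″E
zyl = (dms_to_decimal(24, 57, 47, "N"), dms_to_decimal(91, 52, 0, "E"))
jgs = (dms_to_decimal(26, 51, 24, "N"), dms_to_decimal(114, 44, 13, "E"))
print(zyl, jgs)  # ≈ (24.9631, 91.8667) and (26.8567, 114.7369)
```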